Jan 30 03:24:30 np0005601977 kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Jan 30 03:24:30 np0005601977 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 30 03:24:30 np0005601977 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 30 03:24:30 np0005601977 kernel: BIOS-provided physical RAM map:
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 30 03:24:30 np0005601977 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 30 03:24:30 np0005601977 kernel: NX (Execute Disable) protection: active
Jan 30 03:24:30 np0005601977 kernel: APIC: Static calls initialized
Jan 30 03:24:30 np0005601977 kernel: SMBIOS 2.8 present.
Jan 30 03:24:30 np0005601977 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 30 03:24:30 np0005601977 kernel: Hypervisor detected: KVM
Jan 30 03:24:30 np0005601977 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 30 03:24:30 np0005601977 kernel: kvm-clock: using sched offset of 5662314400 cycles
Jan 30 03:24:30 np0005601977 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 30 03:24:30 np0005601977 kernel: tsc: Detected 2800.000 MHz processor
Jan 30 03:24:30 np0005601977 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 30 03:24:30 np0005601977 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 30 03:24:30 np0005601977 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 30 03:24:30 np0005601977 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 30 03:24:30 np0005601977 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 30 03:24:30 np0005601977 kernel: Using GB pages for direct mapping
Jan 30 03:24:30 np0005601977 kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Jan 30 03:24:30 np0005601977 kernel: ACPI: Early table checksum verification disabled
Jan 30 03:24:30 np0005601977 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 30 03:24:30 np0005601977 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:30 np0005601977 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:30 np0005601977 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:30 np0005601977 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 30 03:24:30 np0005601977 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:30 np0005601977 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:30 np0005601977 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 30 03:24:30 np0005601977 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 30 03:24:30 np0005601977 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 30 03:24:30 np0005601977 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 30 03:24:30 np0005601977 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 30 03:24:30 np0005601977 kernel: No NUMA configuration found
Jan 30 03:24:30 np0005601977 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 30 03:24:30 np0005601977 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 30 03:24:30 np0005601977 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 30 03:24:30 np0005601977 kernel: Zone ranges:
Jan 30 03:24:30 np0005601977 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 30 03:24:30 np0005601977 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 30 03:24:30 np0005601977 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 30 03:24:30 np0005601977 kernel:  Device   empty
Jan 30 03:24:30 np0005601977 kernel: Movable zone start for each node
Jan 30 03:24:30 np0005601977 kernel: Early memory node ranges
Jan 30 03:24:30 np0005601977 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 30 03:24:30 np0005601977 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 30 03:24:30 np0005601977 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 30 03:24:30 np0005601977 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 30 03:24:30 np0005601977 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 30 03:24:30 np0005601977 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 30 03:24:30 np0005601977 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 30 03:24:30 np0005601977 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 30 03:24:30 np0005601977 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 30 03:24:30 np0005601977 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 30 03:24:30 np0005601977 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 30 03:24:30 np0005601977 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 30 03:24:30 np0005601977 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 30 03:24:30 np0005601977 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 30 03:24:30 np0005601977 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 30 03:24:30 np0005601977 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 30 03:24:30 np0005601977 kernel: TSC deadline timer available
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Max. logical packages:   8
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Max. logical dies:       8
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Max. dies per package:   1
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Max. threads per core:   1
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Num. cores per package:     1
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Num. threads per package:   1
Jan 30 03:24:30 np0005601977 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 30 03:24:30 np0005601977 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 30 03:24:30 np0005601977 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 30 03:24:30 np0005601977 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 30 03:24:30 np0005601977 kernel: Booting paravirtualized kernel on KVM
Jan 30 03:24:30 np0005601977 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 30 03:24:30 np0005601977 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 30 03:24:30 np0005601977 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 30 03:24:30 np0005601977 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 30 03:24:30 np0005601977 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 30 03:24:30 np0005601977 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Jan 30 03:24:30 np0005601977 kernel: random: crng init done
Jan 30 03:24:30 np0005601977 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: Fallback order for Node 0: 0 
Jan 30 03:24:30 np0005601977 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 30 03:24:30 np0005601977 kernel: Policy zone: Normal
Jan 30 03:24:30 np0005601977 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 30 03:24:30 np0005601977 kernel: software IO TLB: area num 8.
Jan 30 03:24:30 np0005601977 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 30 03:24:30 np0005601977 kernel: ftrace: allocating 49438 entries in 194 pages
Jan 30 03:24:30 np0005601977 kernel: ftrace: allocated 194 pages with 3 groups
Jan 30 03:24:30 np0005601977 kernel: Dynamic Preempt: voluntary
Jan 30 03:24:30 np0005601977 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 30 03:24:30 np0005601977 kernel: rcu: 	RCU event tracing is enabled.
Jan 30 03:24:30 np0005601977 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 30 03:24:30 np0005601977 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 30 03:24:30 np0005601977 kernel: 	Rude variant of Tasks RCU enabled.
Jan 30 03:24:30 np0005601977 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 30 03:24:30 np0005601977 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 30 03:24:30 np0005601977 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 30 03:24:30 np0005601977 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 30 03:24:30 np0005601977 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 30 03:24:30 np0005601977 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 30 03:24:30 np0005601977 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 30 03:24:30 np0005601977 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 30 03:24:30 np0005601977 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 30 03:24:30 np0005601977 kernel: Console: colour VGA+ 80x25
Jan 30 03:24:30 np0005601977 kernel: printk: console [ttyS0] enabled
Jan 30 03:24:30 np0005601977 kernel: ACPI: Core revision 20230331
Jan 30 03:24:30 np0005601977 kernel: APIC: Switch to symmetric I/O mode setup
Jan 30 03:24:30 np0005601977 kernel: x2apic enabled
Jan 30 03:24:30 np0005601977 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 30 03:24:30 np0005601977 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 30 03:24:30 np0005601977 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 30 03:24:30 np0005601977 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 30 03:24:30 np0005601977 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 30 03:24:30 np0005601977 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 30 03:24:30 np0005601977 kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Jan 30 03:24:30 np0005601977 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 30 03:24:30 np0005601977 kernel: Spectre V2 : Mitigation: Retpolines
Jan 30 03:24:30 np0005601977 kernel: RETBleed: Mitigation: untrained return thunk
Jan 30 03:24:30 np0005601977 kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Jan 30 03:24:30 np0005601977 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 30 03:24:30 np0005601977 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 30 03:24:30 np0005601977 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 30 03:24:30 np0005601977 kernel: active return thunk: retbleed_return_thunk
Jan 30 03:24:30 np0005601977 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 30 03:24:30 np0005601977 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 30 03:24:30 np0005601977 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 30 03:24:30 np0005601977 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 30 03:24:30 np0005601977 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 30 03:24:30 np0005601977 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 30 03:24:30 np0005601977 kernel: Freeing SMP alternatives memory: 40K
Jan 30 03:24:30 np0005601977 kernel: pid_max: default: 32768 minimum: 301
Jan 30 03:24:30 np0005601977 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 30 03:24:30 np0005601977 kernel: landlock: Up and running.
Jan 30 03:24:30 np0005601977 kernel: Yama: becoming mindful.
Jan 30 03:24:30 np0005601977 kernel: SELinux:  Initializing.
Jan 30 03:24:30 np0005601977 kernel: LSM support for eBPF active
Jan 30 03:24:30 np0005601977 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 30 03:24:30 np0005601977 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 30 03:24:30 np0005601977 kernel: ... version:                0
Jan 30 03:24:30 np0005601977 kernel: ... bit width:              48
Jan 30 03:24:30 np0005601977 kernel: ... generic registers:      6
Jan 30 03:24:30 np0005601977 kernel: ... value mask:             0000ffffffffffff
Jan 30 03:24:30 np0005601977 kernel: ... max period:             00007fffffffffff
Jan 30 03:24:30 np0005601977 kernel: ... fixed-purpose events:   0
Jan 30 03:24:30 np0005601977 kernel: ... event mask:             000000000000003f
Jan 30 03:24:30 np0005601977 kernel: signal: max sigframe size: 1776
Jan 30 03:24:30 np0005601977 kernel: rcu: Hierarchical SRCU implementation.
Jan 30 03:24:30 np0005601977 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 30 03:24:30 np0005601977 kernel: smp: Bringing up secondary CPUs ...
Jan 30 03:24:30 np0005601977 kernel: smpboot: x86: Booting SMP configuration:
Jan 30 03:24:30 np0005601977 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 30 03:24:30 np0005601977 kernel: smp: Brought up 1 node, 8 CPUs
Jan 30 03:24:30 np0005601977 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 30 03:24:30 np0005601977 kernel: node 0 deferred pages initialised in 9ms
Jan 30 03:24:30 np0005601977 kernel: Memory: 7763912K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618408K reserved, 0K cma-reserved)
Jan 30 03:24:30 np0005601977 kernel: devtmpfs: initialized
Jan 30 03:24:30 np0005601977 kernel: x86/mm: Memory block size: 128MB
Jan 30 03:24:30 np0005601977 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 30 03:24:30 np0005601977 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 30 03:24:30 np0005601977 kernel: pinctrl core: initialized pinctrl subsystem
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 30 03:24:30 np0005601977 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 30 03:24:30 np0005601977 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 30 03:24:30 np0005601977 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 30 03:24:30 np0005601977 kernel: audit: initializing netlink subsys (disabled)
Jan 30 03:24:30 np0005601977 kernel: audit: type=2000 audit(1769761469.028:1): state=initialized audit_enabled=0 res=1
Jan 30 03:24:30 np0005601977 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 30 03:24:30 np0005601977 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 30 03:24:30 np0005601977 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 30 03:24:30 np0005601977 kernel: cpuidle: using governor menu
Jan 30 03:24:30 np0005601977 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 30 03:24:30 np0005601977 kernel: PCI: Using configuration type 1 for base access
Jan 30 03:24:30 np0005601977 kernel: PCI: Using configuration type 1 for extended access
Jan 30 03:24:30 np0005601977 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 30 03:24:30 np0005601977 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 30 03:24:30 np0005601977 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 30 03:24:30 np0005601977 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 30 03:24:30 np0005601977 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 30 03:24:30 np0005601977 kernel: Demotion targets for Node 0: null
Jan 30 03:24:30 np0005601977 kernel: cryptd: max_cpu_qlen set to 1000
Jan 30 03:24:30 np0005601977 kernel: ACPI: Added _OSI(Module Device)
Jan 30 03:24:30 np0005601977 kernel: ACPI: Added _OSI(Processor Device)
Jan 30 03:24:30 np0005601977 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 30 03:24:30 np0005601977 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 30 03:24:30 np0005601977 kernel: ACPI: Interpreter enabled
Jan 30 03:24:30 np0005601977 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 30 03:24:30 np0005601977 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 30 03:24:30 np0005601977 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 30 03:24:30 np0005601977 kernel: PCI: Using E820 reservations for host bridge windows
Jan 30 03:24:30 np0005601977 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 30 03:24:30 np0005601977 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 30 03:24:30 np0005601977 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [3] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [4] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [5] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [6] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [7] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [8] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [9] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [10] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [11] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [12] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [13] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [14] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [15] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [16] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [17] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [18] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [19] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [20] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [21] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [22] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [23] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [24] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [25] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [26] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [27] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [28] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [29] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [30] registered
Jan 30 03:24:30 np0005601977 kernel: acpiphp: Slot [31] registered
Jan 30 03:24:30 np0005601977 kernel: PCI host bridge to bus 0000:00
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 30 03:24:30 np0005601977 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 30 03:24:30 np0005601977 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 30 03:24:30 np0005601977 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 30 03:24:30 np0005601977 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 30 03:24:30 np0005601977 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 30 03:24:30 np0005601977 kernel: iommu: Default domain type: Translated
Jan 30 03:24:30 np0005601977 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 30 03:24:30 np0005601977 kernel: SCSI subsystem initialized
Jan 30 03:24:30 np0005601977 kernel: ACPI: bus type USB registered
Jan 30 03:24:30 np0005601977 kernel: usbcore: registered new interface driver usbfs
Jan 30 03:24:30 np0005601977 kernel: usbcore: registered new interface driver hub
Jan 30 03:24:30 np0005601977 kernel: usbcore: registered new device driver usb
Jan 30 03:24:30 np0005601977 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 30 03:24:30 np0005601977 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 30 03:24:30 np0005601977 kernel: PTP clock support registered
Jan 30 03:24:30 np0005601977 kernel: EDAC MC: Ver: 3.0.0
Jan 30 03:24:30 np0005601977 kernel: NetLabel: Initializing
Jan 30 03:24:30 np0005601977 kernel: NetLabel:  domain hash size = 128
Jan 30 03:24:30 np0005601977 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 30 03:24:30 np0005601977 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 30 03:24:30 np0005601977 kernel: PCI: Using ACPI for IRQ routing
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 30 03:24:30 np0005601977 kernel: vgaarb: loaded
Jan 30 03:24:30 np0005601977 kernel: clocksource: Switched to clocksource kvm-clock
Jan 30 03:24:30 np0005601977 kernel: VFS: Disk quotas dquot_6.6.0
Jan 30 03:24:30 np0005601977 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 30 03:24:30 np0005601977 kernel: pnp: PnP ACPI init
Jan 30 03:24:30 np0005601977 kernel: pnp: PnP ACPI: found 5 devices
Jan 30 03:24:30 np0005601977 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_INET protocol family
Jan 30 03:24:30 np0005601977 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 30 03:24:30 np0005601977 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_XDP protocol family
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 30 03:24:30 np0005601977 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 30 03:24:30 np0005601977 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 30 03:24:30 np0005601977 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 36883 usecs
Jan 30 03:24:30 np0005601977 kernel: PCI: CLS 0 bytes, default 64
Jan 30 03:24:30 np0005601977 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 30 03:24:30 np0005601977 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 30 03:24:30 np0005601977 kernel: ACPI: bus type thunderbolt registered
Jan 30 03:24:30 np0005601977 kernel: Trying to unpack rootfs image as initramfs...
Jan 30 03:24:30 np0005601977 kernel: Initialise system trusted keyrings
Jan 30 03:24:30 np0005601977 kernel: Key type blacklist registered
Jan 30 03:24:30 np0005601977 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 30 03:24:30 np0005601977 kernel: zbud: loaded
Jan 30 03:24:30 np0005601977 kernel: integrity: Platform Keyring initialized
Jan 30 03:24:30 np0005601977 kernel: integrity: Machine keyring initialized
Jan 30 03:24:30 np0005601977 kernel: Freeing initrd memory: 88000K
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_ALG protocol family
Jan 30 03:24:30 np0005601977 kernel: xor: automatically using best checksumming function   avx       
Jan 30 03:24:30 np0005601977 kernel: Key type asymmetric registered
Jan 30 03:24:30 np0005601977 kernel: Asymmetric key parser 'x509' registered
Jan 30 03:24:30 np0005601977 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 30 03:24:30 np0005601977 kernel: io scheduler mq-deadline registered
Jan 30 03:24:30 np0005601977 kernel: io scheduler kyber registered
Jan 30 03:24:30 np0005601977 kernel: io scheduler bfq registered
Jan 30 03:24:30 np0005601977 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 30 03:24:30 np0005601977 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 30 03:24:30 np0005601977 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 30 03:24:30 np0005601977 kernel: ACPI: button: Power Button [PWRF]
Jan 30 03:24:30 np0005601977 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 30 03:24:30 np0005601977 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 30 03:24:30 np0005601977 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 30 03:24:30 np0005601977 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 30 03:24:30 np0005601977 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 30 03:24:30 np0005601977 kernel: Non-volatile memory driver v1.3
Jan 30 03:24:30 np0005601977 kernel: rdac: device handler registered
Jan 30 03:24:30 np0005601977 kernel: hp_sw: device handler registered
Jan 30 03:24:30 np0005601977 kernel: emc: device handler registered
Jan 30 03:24:30 np0005601977 kernel: alua: device handler registered
Jan 30 03:24:30 np0005601977 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 30 03:24:30 np0005601977 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 30 03:24:30 np0005601977 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 30 03:24:30 np0005601977 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 30 03:24:30 np0005601977 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 30 03:24:30 np0005601977 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 30 03:24:30 np0005601977 kernel: usb usb1: Product: UHCI Host Controller
Jan 30 03:24:30 np0005601977 kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Jan 30 03:24:30 np0005601977 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 30 03:24:30 np0005601977 kernel: hub 1-0:1.0: USB hub found
Jan 30 03:24:30 np0005601977 kernel: hub 1-0:1.0: 2 ports detected
Jan 30 03:24:30 np0005601977 kernel: usbcore: registered new interface driver usbserial_generic
Jan 30 03:24:30 np0005601977 kernel: usbserial: USB Serial support registered for generic
Jan 30 03:24:30 np0005601977 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 30 03:24:30 np0005601977 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 30 03:24:30 np0005601977 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 30 03:24:30 np0005601977 kernel: mousedev: PS/2 mouse device common for all mice
Jan 30 03:24:30 np0005601977 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 30 03:24:30 np0005601977 kernel: rtc_cmos 00:04: registered as rtc0
Jan 30 03:24:30 np0005601977 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 30 03:24:30 np0005601977 kernel: rtc_cmos 00:04: setting system clock to 2026-01-30T08:24:29 UTC (1769761469)
Jan 30 03:24:30 np0005601977 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 30 03:24:30 np0005601977 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 30 03:24:30 np0005601977 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 30 03:24:30 np0005601977 kernel: usbcore: registered new interface driver usbhid
Jan 30 03:24:30 np0005601977 kernel: usbhid: USB HID core driver
Jan 30 03:24:30 np0005601977 kernel: drop_monitor: Initializing network drop monitor service
Jan 30 03:24:30 np0005601977 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 30 03:24:30 np0005601977 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 30 03:24:30 np0005601977 kernel: Initializing XFRM netlink socket
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_INET6 protocol family
Jan 30 03:24:30 np0005601977 kernel: Segment Routing with IPv6
Jan 30 03:24:30 np0005601977 kernel: NET: Registered PF_PACKET protocol family
Jan 30 03:24:30 np0005601977 kernel: mpls_gso: MPLS GSO support
Jan 30 03:24:30 np0005601977 kernel: IPI shorthand broadcast: enabled
Jan 30 03:24:30 np0005601977 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 30 03:24:30 np0005601977 kernel: AES CTR mode by8 optimization enabled
Jan 30 03:24:30 np0005601977 kernel: sched_clock: Marking stable (935004240, 162189530)->(1170914020, -73720250)
Jan 30 03:24:30 np0005601977 kernel: registered taskstats version 1
Jan 30 03:24:30 np0005601977 kernel: Loading compiled-in X.509 certificates
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 30 03:24:30 np0005601977 kernel: Demotion targets for Node 0: null
Jan 30 03:24:30 np0005601977 kernel: page_owner is disabled
Jan 30 03:24:30 np0005601977 kernel: Key type .fscrypt registered
Jan 30 03:24:30 np0005601977 kernel: Key type fscrypt-provisioning registered
Jan 30 03:24:30 np0005601977 kernel: Key type big_key registered
Jan 30 03:24:30 np0005601977 kernel: Key type encrypted registered
Jan 30 03:24:30 np0005601977 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 30 03:24:30 np0005601977 kernel: Loading compiled-in module X.509 certificates
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 30 03:24:30 np0005601977 kernel: ima: Allocated hash algorithm: sha256
Jan 30 03:24:30 np0005601977 kernel: ima: No architecture policies found
Jan 30 03:24:30 np0005601977 kernel: evm: Initialising EVM extended attributes:
Jan 30 03:24:30 np0005601977 kernel: evm: security.selinux
Jan 30 03:24:30 np0005601977 kernel: evm: security.SMACK64 (disabled)
Jan 30 03:24:30 np0005601977 kernel: evm: security.SMACK64EXEC (disabled)
Jan 30 03:24:30 np0005601977 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 30 03:24:30 np0005601977 kernel: evm: security.SMACK64MMAP (disabled)
Jan 30 03:24:30 np0005601977 kernel: evm: security.apparmor (disabled)
Jan 30 03:24:30 np0005601977 kernel: evm: security.ima
Jan 30 03:24:30 np0005601977 kernel: evm: security.capability
Jan 30 03:24:30 np0005601977 kernel: evm: HMAC attrs: 0x1
Jan 30 03:24:30 np0005601977 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 30 03:24:30 np0005601977 kernel: Running certificate verification RSA selftest
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 30 03:24:30 np0005601977 kernel: Running certificate verification ECDSA selftest
Jan 30 03:24:30 np0005601977 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 30 03:24:30 np0005601977 kernel: clk: Disabling unused clocks
Jan 30 03:24:30 np0005601977 kernel: Freeing unused decrypted memory: 2028K
Jan 30 03:24:30 np0005601977 kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 30 03:24:30 np0005601977 kernel: Write protecting the kernel read-only data: 30720k
Jan 30 03:24:30 np0005601977 kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Jan 30 03:24:30 np0005601977 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 30 03:24:30 np0005601977 kernel: Run /init as init process
Jan 30 03:24:30 np0005601977 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 30 03:24:30 np0005601977 systemd: Detected virtualization kvm.
Jan 30 03:24:30 np0005601977 systemd: Detected architecture x86-64.
Jan 30 03:24:30 np0005601977 systemd: Running in initrd.
Jan 30 03:24:30 np0005601977 systemd: No hostname configured, using default hostname.
Jan 30 03:24:30 np0005601977 systemd: Hostname set to <localhost>.
Jan 30 03:24:30 np0005601977 systemd: Initializing machine ID from VM UUID.
Jan 30 03:24:30 np0005601977 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 30 03:24:30 np0005601977 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 30 03:24:30 np0005601977 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 30 03:24:30 np0005601977 kernel: usb 1-1: Manufacturer: QEMU
Jan 30 03:24:30 np0005601977 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 30 03:24:30 np0005601977 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 30 03:24:30 np0005601977 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 30 03:24:30 np0005601977 systemd: Queued start job for default target Initrd Default Target.
Jan 30 03:24:30 np0005601977 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 30 03:24:30 np0005601977 systemd: Reached target Local Encrypted Volumes.
Jan 30 03:24:30 np0005601977 systemd: Reached target Initrd /usr File System.
Jan 30 03:24:30 np0005601977 systemd: Reached target Local File Systems.
Jan 30 03:24:30 np0005601977 systemd: Reached target Path Units.
Jan 30 03:24:30 np0005601977 systemd: Reached target Slice Units.
Jan 30 03:24:30 np0005601977 systemd: Reached target Swaps.
Jan 30 03:24:30 np0005601977 systemd: Reached target Timer Units.
Jan 30 03:24:30 np0005601977 systemd: Listening on D-Bus System Message Bus Socket.
Jan 30 03:24:30 np0005601977 systemd: Listening on Journal Socket (/dev/log).
Jan 30 03:24:30 np0005601977 systemd: Listening on Journal Socket.
Jan 30 03:24:30 np0005601977 systemd: Listening on udev Control Socket.
Jan 30 03:24:30 np0005601977 systemd: Listening on udev Kernel Socket.
Jan 30 03:24:30 np0005601977 systemd: Reached target Socket Units.
Jan 30 03:24:30 np0005601977 systemd: Starting Create List of Static Device Nodes...
Jan 30 03:24:30 np0005601977 systemd: Starting Journal Service...
Jan 30 03:24:30 np0005601977 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 30 03:24:30 np0005601977 systemd: Starting Apply Kernel Variables...
Jan 30 03:24:30 np0005601977 systemd: Starting Create System Users...
Jan 30 03:24:30 np0005601977 systemd: Starting Setup Virtual Console...
Jan 30 03:24:30 np0005601977 systemd: Finished Create List of Static Device Nodes.
Jan 30 03:24:30 np0005601977 systemd-journald[301]: Journal started
Jan 30 03:24:30 np0005601977 systemd-journald[301]: Runtime Journal (/run/log/journal/84994b481455435fa6fe1797df140bfa) is 8.0M, max 153.6M, 145.6M free.
Jan 30 03:24:30 np0005601977 systemd: Started Journal Service.
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Apply Kernel Variables.
Jan 30 03:24:30 np0005601977 systemd-sysusers[305]: Creating group 'users' with GID 100.
Jan 30 03:24:30 np0005601977 systemd-sysusers[305]: Creating group 'dbus' with GID 81.
Jan 30 03:24:30 np0005601977 systemd-sysusers[305]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Create System Users.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 30 03:24:30 np0005601977 systemd[1]: Starting Create Volatile Files and Directories...
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Create Volatile Files and Directories.
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Setup Virtual Console.
Jan 30 03:24:30 np0005601977 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting dracut cmdline hook...
Jan 30 03:24:30 np0005601977 dracut-cmdline[321]: dracut-9 dracut-057-102.git20250818.el9
Jan 30 03:24:30 np0005601977 dracut-cmdline[321]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 30 03:24:30 np0005601977 systemd[1]: Finished dracut cmdline hook.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting dracut pre-udev hook...
Jan 30 03:24:30 np0005601977 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 30 03:24:30 np0005601977 kernel: device-mapper: uevent: version 1.0.3
Jan 30 03:24:30 np0005601977 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 30 03:24:30 np0005601977 kernel: RPC: Registered named UNIX socket transport module.
Jan 30 03:24:30 np0005601977 kernel: RPC: Registered udp transport module.
Jan 30 03:24:30 np0005601977 kernel: RPC: Registered tcp transport module.
Jan 30 03:24:30 np0005601977 kernel: RPC: Registered tcp-with-tls transport module.
Jan 30 03:24:30 np0005601977 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 30 03:24:30 np0005601977 rpc.statd[439]: Version 2.5.4 starting
Jan 30 03:24:30 np0005601977 rpc.statd[439]: Initializing NSM state
Jan 30 03:24:30 np0005601977 rpc.idmapd[444]: Setting log level to 0
Jan 30 03:24:30 np0005601977 systemd[1]: Finished dracut pre-udev hook.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 30 03:24:30 np0005601977 systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Jan 30 03:24:30 np0005601977 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting dracut pre-trigger hook...
Jan 30 03:24:30 np0005601977 systemd[1]: Finished dracut pre-trigger hook.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting Coldplug All udev Devices...
Jan 30 03:24:30 np0005601977 systemd[1]: Created slice Slice /system/modprobe.
Jan 30 03:24:30 np0005601977 systemd[1]: Starting Load Kernel Module configfs...
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Coldplug All udev Devices.
Jan 30 03:24:30 np0005601977 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 30 03:24:30 np0005601977 systemd[1]: Finished Load Kernel Module configfs.
Jan 30 03:24:30 np0005601977 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 30 03:24:30 np0005601977 systemd[1]: Reached target Network.
Jan 30 03:24:30 np0005601977 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 30 03:24:30 np0005601977 systemd[1]: Starting dracut initqueue hook...
Jan 30 03:24:30 np0005601977 systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 03:24:30 np0005601977 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 30 03:24:30 np0005601977 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 30 03:24:30 np0005601977 kernel: vda: vda1
Jan 30 03:24:30 np0005601977 kernel: scsi host0: ata_piix
Jan 30 03:24:30 np0005601977 kernel: scsi host1: ata_piix
Jan 30 03:24:30 np0005601977 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 30 03:24:30 np0005601977 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 30 03:24:30 np0005601977 systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 30 03:24:30 np0005601977 systemd[1]: Reached target Initrd Root Device.
Jan 30 03:24:31 np0005601977 systemd[1]: Mounting Kernel Configuration File System...
Jan 30 03:24:31 np0005601977 systemd[1]: Mounted Kernel Configuration File System.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target System Initialization.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Basic System.
Jan 30 03:24:31 np0005601977 kernel: ata1: found unknown device (class 0)
Jan 30 03:24:31 np0005601977 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 30 03:24:31 np0005601977 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 30 03:24:31 np0005601977 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 30 03:24:31 np0005601977 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 30 03:24:31 np0005601977 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 30 03:24:31 np0005601977 systemd[1]: Finished dracut initqueue hook.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Remote File Systems.
Jan 30 03:24:31 np0005601977 systemd[1]: Starting dracut pre-mount hook...
Jan 30 03:24:31 np0005601977 systemd[1]: Finished dracut pre-mount hook.
Jan 30 03:24:31 np0005601977 systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Jan 30 03:24:31 np0005601977 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Jan 30 03:24:31 np0005601977 systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 30 03:24:31 np0005601977 systemd[1]: Mounting /sysroot...
Jan 30 03:24:31 np0005601977 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 30 03:24:31 np0005601977 kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Jan 30 03:24:31 np0005601977 kernel: XFS (vda1): Ending clean mount
Jan 30 03:24:31 np0005601977 systemd[1]: Mounted /sysroot.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Initrd Root File System.
Jan 30 03:24:31 np0005601977 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 30 03:24:31 np0005601977 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 30 03:24:31 np0005601977 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Initrd File Systems.
Jan 30 03:24:31 np0005601977 systemd[1]: Reached target Initrd Default Target.
Jan 30 03:24:31 np0005601977 systemd[1]: Starting dracut mount hook...
Jan 30 03:24:31 np0005601977 systemd[1]: Finished dracut mount hook.
Jan 30 03:24:31 np0005601977 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 30 03:24:32 np0005601977 rpc.idmapd[444]: exiting on signal 15
Jan 30 03:24:32 np0005601977 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 30 03:24:32 np0005601977 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Network.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Timer Units.
Jan 30 03:24:32 np0005601977 systemd[1]: dbus.socket: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Initrd Default Target.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Basic System.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Initrd Root Device.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Initrd /usr File System.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Path Units.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Remote File Systems.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Slice Units.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Socket Units.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target System Initialization.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Local File Systems.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Swaps.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut mount hook.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut pre-mount hook.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut initqueue hook.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Apply Kernel Variables.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Coldplug All udev Devices.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut pre-trigger hook.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Setup Virtual Console.
Jan 30 03:24:32 np0005601977 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Closed udev Control Socket.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Closed udev Kernel Socket.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut pre-udev hook.
Jan 30 03:24:32 np0005601977 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped dracut cmdline hook.
Jan 30 03:24:32 np0005601977 systemd[1]: Starting Cleanup udev Database...
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 30 03:24:32 np0005601977 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 30 03:24:32 np0005601977 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Stopped Create System Users.
Jan 30 03:24:32 np0005601977 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 30 03:24:32 np0005601977 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 30 03:24:32 np0005601977 systemd[1]: Finished Cleanup udev Database.
Jan 30 03:24:32 np0005601977 systemd[1]: Reached target Switch Root.
Jan 30 03:24:32 np0005601977 systemd[1]: Starting Switch Root...
Jan 30 03:24:32 np0005601977 systemd[1]: Switching root.
Jan 30 03:24:32 np0005601977 systemd-journald[301]: Journal stopped
Jan 30 03:24:33 np0005601977 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 30 03:24:33 np0005601977 kernel: audit: type=1404 audit(1769761472.513:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 03:24:33 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 03:24:33 np0005601977 kernel: audit: type=1403 audit(1769761472.626:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 30 03:24:33 np0005601977 systemd: Successfully loaded SELinux policy in 119.387ms.
Jan 30 03:24:33 np0005601977 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 41.798ms.
Jan 30 03:24:33 np0005601977 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 30 03:24:33 np0005601977 systemd: Detected virtualization kvm.
Jan 30 03:24:33 np0005601977 systemd: Detected architecture x86-64.
Jan 30 03:24:33 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:24:33 np0005601977 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd: Stopped Switch Root.
Jan 30 03:24:33 np0005601977 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 30 03:24:33 np0005601977 systemd: Created slice Slice /system/getty.
Jan 30 03:24:33 np0005601977 systemd: Created slice Slice /system/serial-getty.
Jan 30 03:24:33 np0005601977 systemd: Created slice Slice /system/sshd-keygen.
Jan 30 03:24:33 np0005601977 systemd: Created slice User and Session Slice.
Jan 30 03:24:33 np0005601977 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 30 03:24:33 np0005601977 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 30 03:24:33 np0005601977 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 30 03:24:33 np0005601977 systemd: Reached target Local Encrypted Volumes.
Jan 30 03:24:33 np0005601977 systemd: Stopped target Switch Root.
Jan 30 03:24:33 np0005601977 systemd: Stopped target Initrd File Systems.
Jan 30 03:24:33 np0005601977 systemd: Stopped target Initrd Root File System.
Jan 30 03:24:33 np0005601977 systemd: Reached target Local Integrity Protected Volumes.
Jan 30 03:24:33 np0005601977 systemd: Reached target Path Units.
Jan 30 03:24:33 np0005601977 systemd: Reached target rpc_pipefs.target.
Jan 30 03:24:33 np0005601977 systemd: Reached target Slice Units.
Jan 30 03:24:33 np0005601977 systemd: Reached target Swaps.
Jan 30 03:24:33 np0005601977 systemd: Reached target Local Verity Protected Volumes.
Jan 30 03:24:33 np0005601977 systemd: Listening on RPCbind Server Activation Socket.
Jan 30 03:24:33 np0005601977 systemd: Reached target RPC Port Mapper.
Jan 30 03:24:33 np0005601977 systemd: Listening on Process Core Dump Socket.
Jan 30 03:24:33 np0005601977 systemd: Listening on initctl Compatibility Named Pipe.
Jan 30 03:24:33 np0005601977 systemd: Listening on udev Control Socket.
Jan 30 03:24:33 np0005601977 systemd: Listening on udev Kernel Socket.
Jan 30 03:24:33 np0005601977 systemd: Mounting Huge Pages File System...
Jan 30 03:24:33 np0005601977 systemd: Mounting POSIX Message Queue File System...
Jan 30 03:24:33 np0005601977 systemd: Mounting Kernel Debug File System...
Jan 30 03:24:33 np0005601977 systemd: Mounting Kernel Trace File System...
Jan 30 03:24:33 np0005601977 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 30 03:24:33 np0005601977 systemd: Starting Create List of Static Device Nodes...
Jan 30 03:24:33 np0005601977 systemd: Starting Load Kernel Module configfs...
Jan 30 03:24:33 np0005601977 systemd: Starting Load Kernel Module drm...
Jan 30 03:24:33 np0005601977 systemd: Starting Load Kernel Module efi_pstore...
Jan 30 03:24:33 np0005601977 systemd: Starting Load Kernel Module fuse...
Jan 30 03:24:33 np0005601977 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 30 03:24:33 np0005601977 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd: Stopped File System Check on Root Device.
Jan 30 03:24:33 np0005601977 systemd: Stopped Journal Service.
Jan 30 03:24:33 np0005601977 systemd: Starting Journal Service...
Jan 30 03:24:33 np0005601977 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 30 03:24:33 np0005601977 systemd: Starting Generate network units from Kernel command line...
Jan 30 03:24:33 np0005601977 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 30 03:24:33 np0005601977 systemd: Starting Remount Root and Kernel File Systems...
Jan 30 03:24:33 np0005601977 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 30 03:24:33 np0005601977 systemd: Starting Apply Kernel Variables...
Jan 30 03:24:33 np0005601977 systemd: Starting Coldplug All udev Devices...
Jan 30 03:24:33 np0005601977 systemd-journald[675]: Journal started
Jan 30 03:24:33 np0005601977 systemd-journald[675]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 30 03:24:33 np0005601977 systemd[1]: Queued start job for default target Multi-User System.
Jan 30 03:24:33 np0005601977 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd: Started Journal Service.
Jan 30 03:24:33 np0005601977 systemd[1]: Mounted Huge Pages File System.
Jan 30 03:24:33 np0005601977 systemd[1]: Mounted POSIX Message Queue File System.
Jan 30 03:24:33 np0005601977 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 30 03:24:33 np0005601977 systemd[1]: Mounted Kernel Debug File System.
Jan 30 03:24:33 np0005601977 systemd[1]: Mounted Kernel Trace File System.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Create List of Static Device Nodes.
Jan 30 03:24:33 np0005601977 kernel: ACPI: bus type drm_connector registered
Jan 30 03:24:33 np0005601977 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Load Kernel Module configfs.
Jan 30 03:24:33 np0005601977 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Load Kernel Module drm.
Jan 30 03:24:33 np0005601977 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 30 03:24:33 np0005601977 kernel: fuse: init (API version 7.37)
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 30 03:24:33 np0005601977 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Load Kernel Module fuse.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Generate network units from Kernel command line.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Apply Kernel Variables.
Jan 30 03:24:33 np0005601977 systemd[1]: Mounting FUSE Control File System...
Jan 30 03:24:33 np0005601977 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Rebuild Hardware Database...
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 30 03:24:33 np0005601977 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Load/Save OS Random Seed...
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Create System Users...
Jan 30 03:24:33 np0005601977 systemd[1]: Mounted FUSE Control File System.
Jan 30 03:24:33 np0005601977 systemd-journald[675]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 30 03:24:33 np0005601977 systemd-journald[675]: Received client request to flush runtime journal.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Coldplug All udev Devices.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Load/Save OS Random Seed.
Jan 30 03:24:33 np0005601977 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Create System Users.
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 30 03:24:33 np0005601977 systemd[1]: Reached target Preparation for Local File Systems.
Jan 30 03:24:33 np0005601977 systemd[1]: Reached target Local File Systems.
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 30 03:24:33 np0005601977 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 30 03:24:33 np0005601977 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 30 03:24:33 np0005601977 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Automatic Boot Loader Update...
Jan 30 03:24:33 np0005601977 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Create Volatile Files and Directories...
Jan 30 03:24:33 np0005601977 bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Automatic Boot Loader Update.
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Create Volatile Files and Directories.
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Security Auditing Service...
Jan 30 03:24:33 np0005601977 systemd[1]: Starting RPC Bind...
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Rebuild Journal Catalog...
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Rebuild Journal Catalog.
Jan 30 03:24:33 np0005601977 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 30 03:24:33 np0005601977 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 30 03:24:33 np0005601977 systemd[1]: Started RPC Bind.
Jan 30 03:24:33 np0005601977 augenrules[706]: /sbin/augenrules: No change
Jan 30 03:24:33 np0005601977 augenrules[721]: No rules
Jan 30 03:24:33 np0005601977 augenrules[721]: enabled 1
Jan 30 03:24:33 np0005601977 augenrules[721]: failure 1
Jan 30 03:24:33 np0005601977 augenrules[721]: pid 701
Jan 30 03:24:33 np0005601977 augenrules[721]: rate_limit 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_limit 8192
Jan 30 03:24:33 np0005601977 augenrules[721]: lost 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_wait_time 60000
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_wait_time_actual 0
Jan 30 03:24:33 np0005601977 augenrules[721]: enabled 1
Jan 30 03:24:33 np0005601977 augenrules[721]: failure 1
Jan 30 03:24:33 np0005601977 augenrules[721]: pid 701
Jan 30 03:24:33 np0005601977 augenrules[721]: rate_limit 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_limit 8192
Jan 30 03:24:33 np0005601977 augenrules[721]: lost 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog 3
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_wait_time 60000
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_wait_time_actual 0
Jan 30 03:24:33 np0005601977 augenrules[721]: enabled 1
Jan 30 03:24:33 np0005601977 augenrules[721]: failure 1
Jan 30 03:24:33 np0005601977 augenrules[721]: pid 701
Jan 30 03:24:33 np0005601977 augenrules[721]: rate_limit 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_limit 8192
Jan 30 03:24:33 np0005601977 augenrules[721]: lost 0
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog 3
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_wait_time 60000
Jan 30 03:24:33 np0005601977 augenrules[721]: backlog_wait_time_actual 0
Jan 30 03:24:33 np0005601977 systemd[1]: Started Security Auditing Service.
Jan 30 03:24:33 np0005601977 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 30 03:24:33 np0005601977 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 30 03:24:34 np0005601977 systemd[1]: Finished Rebuild Hardware Database.
Jan 30 03:24:34 np0005601977 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 30 03:24:34 np0005601977 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Jan 30 03:24:34 np0005601977 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 30 03:24:34 np0005601977 systemd[1]: Starting Load Kernel Module configfs...
Jan 30 03:24:34 np0005601977 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 30 03:24:34 np0005601977 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 30 03:24:34 np0005601977 systemd[1]: Finished Load Kernel Module configfs.
Jan 30 03:24:34 np0005601977 systemd-udevd[733]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 03:24:34 np0005601977 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 30 03:24:34 np0005601977 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 30 03:24:34 np0005601977 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 30 03:24:34 np0005601977 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 30 03:24:34 np0005601977 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 30 03:24:34 np0005601977 systemd[1]: Starting Update is Completed...
Jan 30 03:24:34 np0005601977 systemd[1]: Finished Update is Completed.
Jan 30 03:24:34 np0005601977 systemd[1]: Reached target System Initialization.
Jan 30 03:24:34 np0005601977 systemd[1]: Started dnf makecache --timer.
Jan 30 03:24:34 np0005601977 systemd[1]: Started Daily rotation of log files.
Jan 30 03:24:34 np0005601977 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 30 03:24:34 np0005601977 systemd[1]: Reached target Timer Units.
Jan 30 03:24:34 np0005601977 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 30 03:24:34 np0005601977 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 30 03:24:34 np0005601977 systemd[1]: Reached target Socket Units.
Jan 30 03:24:34 np0005601977 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 30 03:24:34 np0005601977 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 30 03:24:34 np0005601977 kernel: Console: switching to colour dummy device 80x25
Jan 30 03:24:34 np0005601977 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 30 03:24:34 np0005601977 kernel: [drm] features: -context_init
Jan 30 03:24:34 np0005601977 kernel: [drm] number of scanouts: 1
Jan 30 03:24:34 np0005601977 kernel: [drm] number of cap sets: 0
Jan 30 03:24:34 np0005601977 systemd[1]: Starting D-Bus System Message Bus...
Jan 30 03:24:34 np0005601977 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 30 03:24:34 np0005601977 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 30 03:24:34 np0005601977 kernel: Console: switching to colour frame buffer device 128x48
Jan 30 03:24:34 np0005601977 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 30 03:24:34 np0005601977 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 30 03:24:34 np0005601977 kernel: kvm_amd: TSC scaling supported
Jan 30 03:24:34 np0005601977 kernel: kvm_amd: Nested Virtualization enabled
Jan 30 03:24:34 np0005601977 kernel: kvm_amd: Nested Paging enabled
Jan 30 03:24:34 np0005601977 kernel: kvm_amd: LBR virtualization supported
Jan 30 03:24:34 np0005601977 systemd[1]: Started D-Bus System Message Bus.
Jan 30 03:24:34 np0005601977 systemd[1]: Reached target Basic System.
Jan 30 03:24:34 np0005601977 dbus-broker-lau[783]: Ready
Jan 30 03:24:34 np0005601977 systemd[1]: Starting NTP client/server...
Jan 30 03:24:34 np0005601977 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 30 03:24:34 np0005601977 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 30 03:24:34 np0005601977 systemd[1]: Starting IPv4 firewall with iptables...
Jan 30 03:24:34 np0005601977 systemd[1]: Started irqbalance daemon.
Jan 30 03:24:34 np0005601977 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 30 03:24:34 np0005601977 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 03:24:34 np0005601977 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 03:24:34 np0005601977 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 03:24:34 np0005601977 systemd[1]: Reached target sshd-keygen.target.
Jan 30 03:24:34 np0005601977 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 30 03:24:34 np0005601977 systemd[1]: Reached target User and Group Name Lookups.
Jan 30 03:24:34 np0005601977 systemd[1]: Starting User Login Management...
Jan 30 03:24:34 np0005601977 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 30 03:24:34 np0005601977 chronyd[829]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 30 03:24:34 np0005601977 systemd-logind[809]: New seat seat0.
Jan 30 03:24:34 np0005601977 systemd-logind[809]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 30 03:24:34 np0005601977 systemd-logind[809]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 30 03:24:34 np0005601977 chronyd[829]: Loaded 0 symmetric keys
Jan 30 03:24:34 np0005601977 systemd[1]: Started User Login Management.
Jan 30 03:24:34 np0005601977 chronyd[829]: Using right/UTC timezone to obtain leap second data
Jan 30 03:24:34 np0005601977 chronyd[829]: Loaded seccomp filter (level 2)
Jan 30 03:24:34 np0005601977 systemd[1]: Started NTP client/server.
Jan 30 03:24:34 np0005601977 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 30 03:24:34 np0005601977 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 30 03:24:34 np0005601977 iptables.init[795]: iptables: Applying firewall rules: [  OK  ]
Jan 30 03:24:34 np0005601977 systemd[1]: Finished IPv4 firewall with iptables.
Jan 30 03:24:35 np0005601977 cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 30 Jan 2026 08:24:35 +0000. Up 6.86 seconds.
Jan 30 03:24:35 np0005601977 systemd[1]: run-cloud\x2dinit-tmp-tmpuksjc3ba.mount: Deactivated successfully.
Jan 30 03:24:35 np0005601977 systemd[1]: Starting Hostname Service...
Jan 30 03:24:35 np0005601977 systemd[1]: Started Hostname Service.
Jan 30 03:24:35 np0005601977 systemd-hostnamed[852]: Hostname set to <np0005601977.novalocal> (static)
Jan 30 03:24:35 np0005601977 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 30 03:24:35 np0005601977 systemd[1]: Reached target Preparation for Network.
Jan 30 03:24:35 np0005601977 systemd[1]: Starting Network Manager...
Jan 30 03:24:35 np0005601977 NetworkManager[856]: <info>  [1769761475.9944] NetworkManager (version 1.54.3-2.el9) is starting... (boot:c4148084-b675-426a-931e-95e26d0c5cd7)
Jan 30 03:24:35 np0005601977 NetworkManager[856]: <info>  [1769761475.9949] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0107] manager[0x55a92642a000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0163] hostname: hostname: using hostnamed
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0163] hostname: static hostname changed from (none) to "np0005601977.novalocal"
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0168] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0315] manager[0x55a92642a000]: rfkill: Wi-Fi hardware radio set enabled
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0316] manager[0x55a92642a000]: rfkill: WWAN hardware radio set enabled
Jan 30 03:24:36 np0005601977 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0410] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0410] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0411] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0412] manager: Networking is enabled by state file
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0415] settings: Loaded settings plugin: keyfile (internal)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0458] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0491] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0512] dhcp: init: Using DHCP client 'internal'
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0517] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0536] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0548] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0564] device (lo): Activation: starting connection 'lo' (c19eaec1-d83a-4993-82a8-76b7a83473d3)
Jan 30 03:24:36 np0005601977 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0576] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0581] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 03:24:36 np0005601977 systemd[1]: Started Network Manager.
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0620] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0625] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0628] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0631] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0634] device (eth0): carrier: link connected
Jan 30 03:24:36 np0005601977 systemd[1]: Reached target Network.
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0637] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0648] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0658] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0665] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0666] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0668] manager: NetworkManager state is now CONNECTING
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0670] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 03:24:36 np0005601977 systemd[1]: Starting Network Manager Wait Online...
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0681] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0686] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:24:36 np0005601977 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 30 03:24:36 np0005601977 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0800] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0802] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.0808] device (lo): Activation: successful, device activated.
Jan 30 03:24:36 np0005601977 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 30 03:24:36 np0005601977 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 30 03:24:36 np0005601977 systemd[1]: Reached target NFS client services.
Jan 30 03:24:36 np0005601977 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 30 03:24:36 np0005601977 systemd[1]: Reached target Remote File Systems.
Jan 30 03:24:36 np0005601977 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4814] dhcp4 (eth0): state changed new lease, address=38.102.83.194
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4835] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4873] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4909] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4917] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4927] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4942] device (eth0): Activation: successful, device activated.
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4950] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 30 03:24:36 np0005601977 NetworkManager[856]: <info>  [1769761476.4962] manager: startup complete
Jan 30 03:24:36 np0005601977 systemd[1]: Finished Network Manager Wait Online.
Jan 30 03:24:36 np0005601977 systemd[1]: Starting Cloud-init: Network Stage...
Jan 30 03:24:36 np0005601977 cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 30 Jan 2026 08:24:36 +0000. Up 8.16 seconds.
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.194         | 255.255.255.0 | global | fa:16:3e:ab:e7:82 |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:feab:e782/64 |       .       |  link  | fa:16:3e:ab:e7:82 |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 30 03:24:36 np0005601977 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 30 03:24:38 np0005601977 cloud-init[920]: Generating public/private rsa key pair.
Jan 30 03:24:38 np0005601977 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 30 03:24:38 np0005601977 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 30 03:24:38 np0005601977 cloud-init[920]: The key fingerprint is:
Jan 30 03:24:38 np0005601977 cloud-init[920]: SHA256:ip/WKUtutdD1lyBwWlIQqGfa9BfDDuezdACB0+T1uOw root@np0005601977.novalocal
Jan 30 03:24:38 np0005601977 cloud-init[920]: The key's randomart image is:
Jan 30 03:24:38 np0005601977 cloud-init[920]: +---[RSA 3072]----+
Jan 30 03:24:38 np0005601977 cloud-init[920]: |       ===o      |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |      +.=.oo     |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |     . ..O. .    |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |    . + o.O..    |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |     * oS*o* . . |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |    ..o.+.* o o  |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |    . o+ =E+ .   |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |     ++.+ .      |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |     o=o         |
Jan 30 03:24:38 np0005601977 cloud-init[920]: +----[SHA256]-----+
Jan 30 03:24:38 np0005601977 cloud-init[920]: Generating public/private ecdsa key pair.
Jan 30 03:24:38 np0005601977 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 30 03:24:38 np0005601977 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 30 03:24:38 np0005601977 cloud-init[920]: The key fingerprint is:
Jan 30 03:24:38 np0005601977 cloud-init[920]: SHA256:37kfCHZVyoZ/tVHzjO9fjF+FgGU29XLdZomnJfWzdpk root@np0005601977.novalocal
Jan 30 03:24:38 np0005601977 cloud-init[920]: The key's randomart image is:
Jan 30 03:24:38 np0005601977 cloud-init[920]: +---[ECDSA 256]---+
Jan 30 03:24:38 np0005601977 cloud-init[920]: |            =..oo|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |           = +o*O|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |          . ooB*%|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |             ==BO|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |        S o ..oE=|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |         o + o.=o|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |          . + o =|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |             . o+|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |            ... o|
Jan 30 03:24:38 np0005601977 cloud-init[920]: +----[SHA256]-----+
Jan 30 03:24:38 np0005601977 cloud-init[920]: Generating public/private ed25519 key pair.
Jan 30 03:24:38 np0005601977 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 30 03:24:38 np0005601977 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 30 03:24:38 np0005601977 cloud-init[920]: The key fingerprint is:
Jan 30 03:24:38 np0005601977 cloud-init[920]: SHA256:TNtJz+l+zqrbVBPWKtpw/0j3SzT4va/t34qzh/mfoLY root@np0005601977.novalocal
Jan 30 03:24:38 np0005601977 cloud-init[920]: The key's randomart image is:
Jan 30 03:24:38 np0005601977 cloud-init[920]: +--[ED25519 256]--+
Jan 30 03:24:38 np0005601977 cloud-init[920]: |                 |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |               . |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |        . .   o .|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |       o + + o.o |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |        S + *.+o |
Jan 30 03:24:38 np0005601977 cloud-init[920]: |           * +o.o|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |          . ++ooo|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |           ==+===|
Jan 30 03:24:38 np0005601977 cloud-init[920]: |          +EBOBO&|
Jan 30 03:24:38 np0005601977 cloud-init[920]: +----[SHA256]-----+
Jan 30 03:24:38 np0005601977 systemd[1]: Finished Cloud-init: Network Stage.
Jan 30 03:24:38 np0005601977 systemd[1]: Reached target Cloud-config availability.
Jan 30 03:24:38 np0005601977 systemd[1]: Reached target Network is Online.
Jan 30 03:24:38 np0005601977 systemd[1]: Starting Cloud-init: Config Stage...
Jan 30 03:24:38 np0005601977 systemd[1]: Starting Crash recovery kernel arming...
Jan 30 03:24:38 np0005601977 systemd[1]: Starting Notify NFS peers of a restart...
Jan 30 03:24:38 np0005601977 systemd[1]: Starting System Logging Service...
Jan 30 03:24:38 np0005601977 systemd[1]: Starting OpenSSH server daemon...
Jan 30 03:24:38 np0005601977 sm-notify[1005]: Version 2.5.4 starting
Jan 30 03:24:38 np0005601977 systemd[1]: Starting Permit User Sessions...
Jan 30 03:24:38 np0005601977 systemd[1]: Started Notify NFS peers of a restart.
Jan 30 03:24:38 np0005601977 systemd[1]: Finished Permit User Sessions.
Jan 30 03:24:38 np0005601977 systemd[1]: Started Command Scheduler.
Jan 30 03:24:38 np0005601977 systemd[1]: Started Getty on tty1.
Jan 30 03:24:38 np0005601977 systemd[1]: Started Serial Getty on ttyS0.
Jan 30 03:24:38 np0005601977 systemd[1]: Reached target Login Prompts.
Jan 30 03:24:38 np0005601977 systemd[1]: Started OpenSSH server daemon.
Jan 30 03:24:38 np0005601977 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 30 03:24:38 np0005601977 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 30 03:24:38 np0005601977 systemd[1]: Started System Logging Service.
Jan 30 03:24:38 np0005601977 systemd[1]: Reached target Multi-User System.
Jan 30 03:24:38 np0005601977 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 30 03:24:38 np0005601977 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 30 03:24:38 np0005601977 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 30 03:24:38 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 03:24:38 np0005601977 kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Jan 30 03:24:38 np0005601977 kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Jan 30 03:24:38 np0005601977 cloud-init[1103]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 30 Jan 2026 08:24:38 +0000. Up 9.83 seconds.
Jan 30 03:24:38 np0005601977 systemd[1]: Finished Cloud-init: Config Stage.
Jan 30 03:24:38 np0005601977 systemd[1]: Starting Cloud-init: Final Stage...
Jan 30 03:24:38 np0005601977 cloud-init[1281]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 30 Jan 2026 08:24:38 +0000. Up 10.21 seconds.
Jan 30 03:24:38 np0005601977 dracut[1287]: dracut-057-102.git20250818.el9
Jan 30 03:24:38 np0005601977 cloud-init[1304]: #############################################################
Jan 30 03:24:38 np0005601977 cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 30 03:24:38 np0005601977 cloud-init[1307]: 256 SHA256:37kfCHZVyoZ/tVHzjO9fjF+FgGU29XLdZomnJfWzdpk root@np0005601977.novalocal (ECDSA)
Jan 30 03:24:38 np0005601977 cloud-init[1309]: 256 SHA256:TNtJz+l+zqrbVBPWKtpw/0j3SzT4va/t34qzh/mfoLY root@np0005601977.novalocal (ED25519)
Jan 30 03:24:38 np0005601977 cloud-init[1311]: 3072 SHA256:ip/WKUtutdD1lyBwWlIQqGfa9BfDDuezdACB0+T1uOw root@np0005601977.novalocal (RSA)
Jan 30 03:24:38 np0005601977 cloud-init[1312]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 30 03:24:38 np0005601977 cloud-init[1313]: #############################################################
Jan 30 03:24:39 np0005601977 cloud-init[1281]: Cloud-init v. 24.4-8.el9 finished at Fri, 30 Jan 2026 08:24:38 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.39 seconds
Jan 30 03:24:39 np0005601977 dracut[1289]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Jan 30 03:24:39 np0005601977 systemd[1]: Finished Cloud-init: Final Stage.
Jan 30 03:24:39 np0005601977 systemd[1]: Reached target Cloud-init target.
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 30 03:24:39 np0005601977 dracut[1289]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: memstrack is not available
Jan 30 03:24:40 np0005601977 dracut[1289]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 30 03:24:40 np0005601977 dracut[1289]: memstrack is not available
Jan 30 03:24:40 np0005601977 dracut[1289]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 30 03:24:40 np0005601977 dracut[1289]: *** Including module: systemd ***
Jan 30 03:24:40 np0005601977 dracut[1289]: *** Including module: fips ***
Jan 30 03:24:40 np0005601977 dracut[1289]: *** Including module: systemd-initrd ***
Jan 30 03:24:40 np0005601977 dracut[1289]: *** Including module: i18n ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: drm ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: prefixdevname ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: kernel-modules ***
Jan 30 03:24:41 np0005601977 chronyd[829]: Selected source 144.217.93.2 (2.centos.pool.ntp.org)
Jan 30 03:24:41 np0005601977 chronyd[829]: System clock TAI offset set to 37 seconds
Jan 30 03:24:41 np0005601977 kernel: block vda: the capability attribute has been deprecated.
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: kernel-modules-extra ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: qemu ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: fstab-sys ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: rootfs-block ***
Jan 30 03:24:41 np0005601977 dracut[1289]: *** Including module: terminfo ***
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: udev-rules ***
Jan 30 03:24:42 np0005601977 dracut[1289]: Skipping udev rule: 91-permissions.rules
Jan 30 03:24:42 np0005601977 dracut[1289]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: virtiofs ***
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: dracut-systemd ***
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: usrmount ***
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: base ***
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: fs-lib ***
Jan 30 03:24:42 np0005601977 dracut[1289]: *** Including module: kdumpbase ***
Jan 30 03:24:43 np0005601977 dracut[1289]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 30 03:24:43 np0005601977 dracut[1289]:  microcode_ctl module: mangling fw_dir
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 30 03:24:43 np0005601977 dracut[1289]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 30 03:24:43 np0005601977 dracut[1289]: *** Including module: openssl ***
Jan 30 03:24:43 np0005601977 dracut[1289]: *** Including module: shutdown ***
Jan 30 03:24:43 np0005601977 dracut[1289]: *** Including module: squash ***
Jan 30 03:24:43 np0005601977 dracut[1289]: *** Including modules done ***
Jan 30 03:24:43 np0005601977 dracut[1289]: *** Installing kernel module dependencies ***
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 35 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 33 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 31 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 28 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 34 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 32 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 30 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 irqbalance[797]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 30 03:24:44 np0005601977 irqbalance[797]: IRQ 29 affinity is now unmanaged
Jan 30 03:24:44 np0005601977 dracut[1289]: *** Installing kernel module dependencies done ***
Jan 30 03:24:44 np0005601977 dracut[1289]: *** Resolving executable dependencies ***
Jan 30 03:24:46 np0005601977 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:24:47 np0005601977 dracut[1289]: *** Resolving executable dependencies done ***
Jan 30 03:24:47 np0005601977 dracut[1289]: *** Generating early-microcode cpio image ***
Jan 30 03:24:47 np0005601977 dracut[1289]: *** Store current command line parameters ***
Jan 30 03:24:47 np0005601977 dracut[1289]: Stored kernel commandline:
Jan 30 03:24:47 np0005601977 dracut[1289]: No dracut internal kernel commandline stored in the initramfs
Jan 30 03:24:47 np0005601977 dracut[1289]: *** Install squash loader ***
Jan 30 03:24:48 np0005601977 dracut[1289]: *** Squashing the files inside the initramfs ***
Jan 30 03:24:49 np0005601977 dracut[1289]: *** Squashing the files inside the initramfs done ***
Jan 30 03:24:49 np0005601977 dracut[1289]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Jan 30 03:24:49 np0005601977 dracut[1289]: *** Hardlinking files ***
Jan 30 03:24:49 np0005601977 dracut[1289]: *** Hardlinking files done ***
Jan 30 03:24:50 np0005601977 dracut[1289]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Jan 30 03:24:51 np0005601977 kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Jan 30 03:24:51 np0005601977 kdumpctl[1015]: kdump: Starting kdump: [OK]
Jan 30 03:24:51 np0005601977 systemd[1]: Finished Crash recovery kernel arming.
Jan 30 03:24:51 np0005601977 systemd[1]: Startup finished in 1.237s (kernel) + 2.653s (initrd) + 18.635s (userspace) = 22.525s.
Jan 30 03:25:06 np0005601977 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 03:25:17 np0005601977 systemd[1]: Created slice User Slice of UID 1000.
Jan 30 03:25:17 np0005601977 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 30 03:25:17 np0005601977 systemd-logind[809]: New session 1 of user zuul.
Jan 30 03:25:18 np0005601977 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 30 03:25:18 np0005601977 systemd[1]: Starting User Manager for UID 1000...
Jan 30 03:25:18 np0005601977 systemd[4308]: Queued start job for default target Main User Target.
Jan 30 03:25:18 np0005601977 systemd[4308]: Created slice User Application Slice.
Jan 30 03:25:18 np0005601977 systemd[4308]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 30 03:25:18 np0005601977 systemd[4308]: Started Daily Cleanup of User's Temporary Directories.
Jan 30 03:25:18 np0005601977 systemd[4308]: Reached target Paths.
Jan 30 03:25:18 np0005601977 systemd[4308]: Reached target Timers.
Jan 30 03:25:18 np0005601977 systemd[4308]: Starting D-Bus User Message Bus Socket...
Jan 30 03:25:18 np0005601977 systemd[4308]: Starting Create User's Volatile Files and Directories...
Jan 30 03:25:18 np0005601977 systemd[4308]: Finished Create User's Volatile Files and Directories.
Jan 30 03:25:18 np0005601977 systemd[4308]: Listening on D-Bus User Message Bus Socket.
Jan 30 03:25:18 np0005601977 systemd[4308]: Reached target Sockets.
Jan 30 03:25:18 np0005601977 systemd[4308]: Reached target Basic System.
Jan 30 03:25:18 np0005601977 systemd[4308]: Reached target Main User Target.
Jan 30 03:25:18 np0005601977 systemd[4308]: Startup finished in 154ms.
Jan 30 03:25:18 np0005601977 systemd[1]: Started User Manager for UID 1000.
Jan 30 03:25:18 np0005601977 systemd[1]: Started Session 1 of User zuul.
Jan 30 03:25:18 np0005601977 python3[4390]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:21 np0005601977 python3[4418]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:29 np0005601977 python3[4476]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:30 np0005601977 python3[4516]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 30 03:25:32 np0005601977 python3[4542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbK+QNrWBUptwh/tsKUHyiCtL2MOslPM6Ok8BMCQo6bFrbUd3b41fHCXLJAKXUhQCi1jGZcObKr9aNeuB2QNxE8xc5bAMGWovKQ4u31cSz3+yCNPmCHNuVIMM7SCn/3SHL9lx+Mlgvr5y4LrvTxeqs+jgjWMwjgcOCmuiCK3sN+5XaVsPM8J8Q3hJc4oPZcz0m7hnNHwPiUCmUZ8/Fa3AZ2CT2rKka37F4HKzmhoCcGuEDcWJYfniS+jSLHe6v94wfn4yfX8Cni+Tg7PMFKfVoNp4bzlsWo7CJR9gC2d3Deo3PclTA3bdiQrDcq142qOGd9C16Ts0gaa5MtZXkkV8REKdfVwGoP8oDcePun1Bvrh+9fpQI050OnRQesD5MVe8uRKX9Li/HPu9YLN3L05zyP8HFNcli3m9jYS5EIwchiexzHMIP61qOGZEgGeKbRzB9UHRhepfxoBwFNaC77XNrSZoJIaWdzxtdC7LWxgN3HBIIXEk/1tRf6BVUZfU7Usk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:32 np0005601977 python3[4566]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:33 np0005601977 python3[4665]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:33 np0005601977 python3[4736]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769761533.188316-251-256874518542203/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6b9a517215ca414698aa6ebfe0ccc07e_id_rsa follow=False checksum=e031682c4cdaec7fe3b6bfdb180dc112513e48ea backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:34 np0005601977 python3[4859]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:34 np0005601977 python3[4930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769761534.04252-306-162629049085315/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6b9a517215ca414698aa6ebfe0ccc07e_id_rsa.pub follow=False checksum=605641a03b6ed0a70ecc2d0ac2aeac1b02553754 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:35 np0005601977 python3[4978]: ansible-ping Invoked with data=pong
Jan 30 03:25:37 np0005601977 python3[5002]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:40 np0005601977 python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 30 03:25:41 np0005601977 python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:41 np0005601977 python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:41 np0005601977 python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:42 np0005601977 python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:42 np0005601977 python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:42 np0005601977 python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:44 np0005601977 python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:45 np0005601977 python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:45 np0005601977 python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769761545.0163116-31-188101150556832/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:46 np0005601977 python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:46 np0005601977 python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:47 np0005601977 python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:47 np0005601977 python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:47 np0005601977 python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601977 python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601977 python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601977 python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601977 python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601977 python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601977 python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601977 python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601977 python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:50 np0005601977 python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:50 np0005601977 python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:50 np0005601977 python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601977 python3[5821]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601977 python3[5845]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601977 python3[5869]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601977 python3[5893]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:52 np0005601977 python3[5917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:52 np0005601977 python3[5941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:52 np0005601977 python3[5965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:53 np0005601977 python3[5989]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:53 np0005601977 python3[6013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:53 np0005601977 python3[6037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:56 np0005601977 python3[6063]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 30 03:25:56 np0005601977 systemd[1]: Starting Time & Date Service...
Jan 30 03:25:56 np0005601977 systemd[1]: Started Time & Date Service.
Jan 30 03:25:56 np0005601977 systemd-timedated[6065]: Changed time zone to 'UTC' (UTC).
Jan 30 03:25:57 np0005601977 python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:57 np0005601977 python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:58 np0005601977 python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769761557.5155466-251-30371911626362/source _original_basename=tmpopmgd_qr follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:58 np0005601977 python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:59 np0005601977 python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769761558.4918573-301-237367230820874/source _original_basename=tmpbeho9spg follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:59 np0005601977 python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:26:00 np0005601977 python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769761559.727391-381-74400929387756/source _original_basename=tmp3jlmh8_h follow=False checksum=4ecd9067f9605648752e2a708ca0f3ad4bf42e85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:01 np0005601977 python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:26:01 np0005601977 python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:26:01 np0005601977 python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:26:02 np0005601977 python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769761561.5045736-451-224178587890884/source _original_basename=tmpvzzrezg8 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:02 np0005601977 python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-ab5a-d783-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:26:03 np0005601977 python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-ab5a-d783-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 30 03:26:04 np0005601977 python3[6922]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:22 np0005601977 python3[6948]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:26 np0005601977 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 30 03:27:04 np0005601977 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 30 03:27:04 np0005601977 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9060] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 30 03:27:04 np0005601977 systemd-udevd[6951]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9268] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9291] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9294] device (eth1): carrier: link connected
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9295] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9300] policy: auto-activating connection 'Wired connection 1' (6c9678c7-6658-3f20-a071-454898fc78dd)
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9304] device (eth1): Activation: starting connection 'Wired connection 1' (6c9678c7-6658-3f20-a071-454898fc78dd)
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9305] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9308] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9312] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 03:27:04 np0005601977 NetworkManager[856]: <info>  [1769761624.9316] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:05 np0005601977 python3[6978]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-3199-9bd3-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:27:12 np0005601977 python3[7058]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:27:13 np0005601977 python3[7131]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769761632.6813936-104-146952331520389/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=4fa452e62614978a92ab3e285162a0ae212a19f9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:27:14 np0005601977 python3[7181]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 03:27:14 np0005601977 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 30 03:27:14 np0005601977 systemd[1]: Stopped Network Manager Wait Online.
Jan 30 03:27:14 np0005601977 systemd[1]: Stopping Network Manager Wait Online...
Jan 30 03:27:14 np0005601977 systemd[1]: Stopping Network Manager...
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1777] caught SIGTERM, shutting down normally.
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1788] dhcp4 (eth0): canceled DHCP transaction
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1789] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1790] dhcp4 (eth0): state changed no lease
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1794] manager: NetworkManager state is now CONNECTING
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1892] dhcp4 (eth1): canceled DHCP transaction
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1894] dhcp4 (eth1): state changed no lease
Jan 30 03:27:14 np0005601977 NetworkManager[856]: <info>  [1769761634.1949] exiting (success)
Jan 30 03:27:14 np0005601977 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:27:14 np0005601977 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:27:14 np0005601977 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 30 03:27:14 np0005601977 systemd[1]: Stopped Network Manager.
Jan 30 03:27:14 np0005601977 systemd[1]: NetworkManager.service: Consumed 1.296s CPU time, 9.9M memory peak.
Jan 30 03:27:14 np0005601977 systemd[1]: Starting Network Manager...
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.2450] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:c4148084-b675-426a-931e-95e26d0c5cd7)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.2454] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.2522] manager[0x560351034000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 30 03:27:14 np0005601977 systemd[1]: Starting Hostname Service...
Jan 30 03:27:14 np0005601977 systemd[1]: Started Hostname Service.
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3419] hostname: hostname: using hostnamed
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3422] hostname: static hostname changed from (none) to "np0005601977.novalocal"
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3428] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3434] manager[0x560351034000]: rfkill: Wi-Fi hardware radio set enabled
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3434] manager[0x560351034000]: rfkill: WWAN hardware radio set enabled
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3475] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3476] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3477] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3477] manager: Networking is enabled by state file
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3480] settings: Loaded settings plugin: keyfile (internal)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3486] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3537] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3554] dhcp: init: Using DHCP client 'internal'
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3559] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3565] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3574] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3585] device (lo): Activation: starting connection 'lo' (c19eaec1-d83a-4993-82a8-76b7a83473d3)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3593] device (eth0): carrier: link connected
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3598] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3605] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3605] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3614] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3623] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3633] device (eth1): carrier: link connected
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3639] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3650] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (6c9678c7-6658-3f20-a071-454898fc78dd) (indicated)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3650] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3659] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3669] device (eth1): Activation: starting connection 'Wired connection 1' (6c9678c7-6658-3f20-a071-454898fc78dd)
Jan 30 03:27:14 np0005601977 systemd[1]: Started Network Manager.
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3677] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3687] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3693] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3696] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3700] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3706] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3710] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3716] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3722] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3735] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3740] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3753] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3757] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3779] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3787] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3797] device (lo): Activation: successful, device activated.
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3809] dhcp4 (eth0): state changed new lease, address=38.102.83.194
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3821] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 30 03:27:14 np0005601977 systemd[1]: Starting Network Manager Wait Online...
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3893] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3912] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3914] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3917] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3920] device (eth0): Activation: successful, device activated.
Jan 30 03:27:14 np0005601977 NetworkManager[7189]: <info>  [1769761634.3926] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 30 03:27:14 np0005601977 python3[7265]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-3199-9bd3-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:27:24 np0005601977 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:27:44 np0005601977 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 03:27:55 np0005601977 systemd[4308]: Starting Mark boot as successful...
Jan 30 03:27:55 np0005601977 systemd[4308]: Finished Mark boot as successful.
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6062] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 03:27:59 np0005601977 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:27:59 np0005601977 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6410] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6414] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6424] device (eth1): Activation: successful, device activated.
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6434] manager: startup complete
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6438] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <warn>  [1769761679.6445] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6461] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 systemd[1]: Finished Network Manager Wait Online.
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6562] dhcp4 (eth1): canceled DHCP transaction
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6562] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6562] dhcp4 (eth1): state changed no lease
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6583] policy: auto-activating connection 'ci-private-network' (694eea5b-2a08-5ef5-80f9-9cd3b604ccf1)
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6589] device (eth1): Activation: starting connection 'ci-private-network' (694eea5b-2a08-5ef5-80f9-9cd3b604ccf1)
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6590] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6594] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6602] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6614] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6666] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6668] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 03:27:59 np0005601977 NetworkManager[7189]: <info>  [1769761679.6676] device (eth1): Activation: successful, device activated.
Jan 30 03:28:09 np0005601977 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:28:14 np0005601977 systemd-logind[809]: Session 1 logged out. Waiting for processes to exit.
Jan 30 03:28:58 np0005601977 systemd-logind[809]: New session 3 of user zuul.
Jan 30 03:28:58 np0005601977 systemd[1]: Started Session 3 of User zuul.
Jan 30 03:28:58 np0005601977 python3[7375]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:28:58 np0005601977 python3[7448]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769761738.1137393-365-116673653154343/source _original_basename=tmpkm7btha4 follow=False checksum=5a54f5751b70d25169fc05ac686186710615a542 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:29:02 np0005601977 systemd[1]: session-3.scope: Deactivated successfully.
Jan 30 03:29:02 np0005601977 systemd-logind[809]: Session 3 logged out. Waiting for processes to exit.
Jan 30 03:29:02 np0005601977 systemd-logind[809]: Removed session 3.
Jan 30 03:30:55 np0005601977 systemd[4308]: Created slice User Background Tasks Slice.
Jan 30 03:30:55 np0005601977 systemd[4308]: Starting Cleanup of User's Temporary Files and Directories...
Jan 30 03:30:55 np0005601977 systemd[4308]: Finished Cleanup of User's Temporary Files and Directories.
Jan 30 03:36:33 np0005601977 systemd-logind[809]: New session 4 of user zuul.
Jan 30 03:36:33 np0005601977 systemd[1]: Started Session 4 of User zuul.
Jan 30 03:36:33 np0005601977 python3[7507]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-f7b8-4607-000000002181-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:34 np0005601977 python3[7536]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:34 np0005601977 python3[7562]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:34 np0005601977 python3[7588]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:35 np0005601977 python3[7614]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:35 np0005601977 python3[7640]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:36 np0005601977 python3[7718]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:36:36 np0005601977 python3[7791]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769762196.128808-538-53283250113985/source _original_basename=tmpkol4hc1f follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:37 np0005601977 python3[7841]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 03:36:37 np0005601977 systemd[1]: Reloading.
Jan 30 03:36:37 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:36:39 np0005601977 python3[7898]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 30 03:36:39 np0005601977 python3[7924]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:40 np0005601977 python3[7952]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:40 np0005601977 python3[7980]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:40 np0005601977 python3[8008]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:41 np0005601977 python3[8035]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-f7b8-4607-000000002188-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:42 np0005601977 python3[8065]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 30 03:36:44 np0005601977 systemd-logind[809]: Session 4 logged out. Waiting for processes to exit.
Jan 30 03:36:44 np0005601977 systemd[1]: session-4.scope: Deactivated successfully.
Jan 30 03:36:44 np0005601977 systemd[1]: session-4.scope: Consumed 3.914s CPU time.
Jan 30 03:36:44 np0005601977 systemd-logind[809]: Removed session 4.
Jan 30 03:36:46 np0005601977 systemd-logind[809]: New session 5 of user zuul.
Jan 30 03:36:46 np0005601977 systemd[1]: Started Session 5 of User zuul.
Jan 30 03:36:47 np0005601977 python3[8100]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 30 03:36:54 np0005601977 setsebool[8142]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 30 03:36:54 np0005601977 setsebool[8142]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 30 03:37:05 np0005601977 kernel: SELinux:  Converting 385 SID table entries...
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 03:37:05 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 03:37:14 np0005601977 kernel: SELinux:  Converting 388 SID table entries...
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 03:37:14 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 03:37:33 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 30 03:37:33 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 03:37:33 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 03:37:33 np0005601977 systemd[1]: Reloading.
Jan 30 03:37:33 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:37:33 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 03:37:39 np0005601977 python3[12993]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-08d8-eacc-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:37:40 np0005601977 kernel: evm: overlay not supported
Jan 30 03:37:40 np0005601977 systemd[4308]: Starting D-Bus User Message Bus...
Jan 30 03:37:40 np0005601977 dbus-broker-launch[13861]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 30 03:37:40 np0005601977 dbus-broker-launch[13861]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 30 03:37:40 np0005601977 systemd[4308]: Started D-Bus User Message Bus.
Jan 30 03:37:40 np0005601977 dbus-broker-lau[13861]: Ready
Jan 30 03:37:40 np0005601977 systemd[4308]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 30 03:37:40 np0005601977 systemd[4308]: Created slice Slice /user.
Jan 30 03:37:40 np0005601977 systemd[4308]: podman-13740.scope: unit configures an IP firewall, but not running as root.
Jan 30 03:37:40 np0005601977 systemd[4308]: (This warning is only shown for the first unit using IP firewalling.)
Jan 30 03:37:40 np0005601977 systemd[4308]: Started podman-13740.scope.
Jan 30 03:37:40 np0005601977 systemd[4308]: Started podman-pause-f9d1412f.scope.
Jan 30 03:37:42 np0005601977 python3[14847]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.119:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.119:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:37:42 np0005601977 python3[14847]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 30 03:37:43 np0005601977 systemd[1]: session-5.scope: Deactivated successfully.
Jan 30 03:37:43 np0005601977 systemd[1]: session-5.scope: Consumed 41.203s CPU time.
Jan 30 03:37:43 np0005601977 systemd-logind[809]: Session 5 logged out. Waiting for processes to exit.
Jan 30 03:37:43 np0005601977 systemd-logind[809]: Removed session 5.
Jan 30 03:38:08 np0005601977 systemd-logind[809]: New session 6 of user zuul.
Jan 30 03:38:08 np0005601977 systemd[1]: Started Session 6 of User zuul.
Jan 30 03:38:08 np0005601977 python3[25006]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF/Y7EpNCX4+jeTf6M3Q1EA4MMTh+cMR1J5eXc3jzvTTUqxCNTO2CaC+w+dR8pLzeBhqJOAlAXh7xbdLGTmEiGA= zuul@np0005601976.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:38:09 np0005601977 python3[25206]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF/Y7EpNCX4+jeTf6M3Q1EA4MMTh+cMR1J5eXc3jzvTTUqxCNTO2CaC+w+dR8pLzeBhqJOAlAXh7xbdLGTmEiGA= zuul@np0005601976.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:38:09 np0005601977 python3[25627]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005601977.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 30 03:38:10 np0005601977 python3[25936]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF/Y7EpNCX4+jeTf6M3Q1EA4MMTh+cMR1J5eXc3jzvTTUqxCNTO2CaC+w+dR8pLzeBhqJOAlAXh7xbdLGTmEiGA= zuul@np0005601976.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:38:11 np0005601977 python3[26131]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:38:11 np0005601977 python3[26340]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769762291.0777435-167-210852488381039/source _original_basename=tmptibue5xz follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:38:12 np0005601977 python3[26614]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 30 03:38:12 np0005601977 systemd[1]: Starting Hostname Service...
Jan 30 03:38:12 np0005601977 systemd[1]: Started Hostname Service.
Jan 30 03:38:12 np0005601977 systemd-hostnamed[26720]: Changed pretty hostname to 'compute-0'
Jan 30 03:38:12 np0005601977 systemd-hostnamed[26720]: Hostname set to <compute-0> (static)
Jan 30 03:38:12 np0005601977 NetworkManager[7189]: <info>  [1769762292.8063] hostname: static hostname changed from "np0005601977.novalocal" to "compute-0"
Jan 30 03:38:12 np0005601977 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:38:12 np0005601977 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:38:13 np0005601977 systemd[1]: session-6.scope: Deactivated successfully.
Jan 30 03:38:13 np0005601977 systemd[1]: session-6.scope: Consumed 2.189s CPU time.
Jan 30 03:38:13 np0005601977 systemd-logind[809]: Session 6 logged out. Waiting for processes to exit.
Jan 30 03:38:13 np0005601977 systemd-logind[809]: Removed session 6.
Jan 30 03:38:22 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 03:38:22 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 03:38:22 np0005601977 systemd[1]: man-db-cache-update.service: Consumed 45.885s CPU time.
Jan 30 03:38:22 np0005601977 systemd[1]: run-r048097a4e545407d80b3033f7f6821c7.service: Deactivated successfully.
Jan 30 03:38:22 np0005601977 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:38:42 np0005601977 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 03:39:45 np0005601977 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 30 03:39:45 np0005601977 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 30 03:39:45 np0005601977 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 30 03:39:45 np0005601977 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 30 03:42:17 np0005601977 systemd-logind[809]: New session 7 of user zuul.
Jan 30 03:42:17 np0005601977 systemd[1]: Started Session 7 of User zuul.
Jan 30 03:42:17 np0005601977 python3[30070]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:42:19 np0005601977 python3[30186]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:19 np0005601977 python3[30259]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:20 np0005601977 python3[30285]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:20 np0005601977 python3[30358]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:20 np0005601977 python3[30384]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:20 np0005601977 python3[30457]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:21 np0005601977 systemd[1]: Starting dnf makecache...
Jan 30 03:42:21 np0005601977 python3[30484]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:21 np0005601977 dnf[30483]: Failed determining last makecache time.
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-openstack-barbican-42b4c41831408a8e323 417 kB/s |  13 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.9 MB/s |  65 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.3 MB/s |  32 kB     00:00
Jan 30 03:42:21 np0005601977 python3[30558]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-python-stevedore-c4acc5639fd2329372142 4.7 MB/s | 131 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.2 MB/s |  32 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-os-refresh-config-9bfc52b5049be2d8de61 7.3 MB/s | 349 kB     00:00
Jan 30 03:42:21 np0005601977 python3[30598]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 755 kB/s |  42 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-python-designate-tests-tempest-347fdbc 790 kB/s |  18 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-openstack-glance-1fd12c29b339f30fe823e 755 kB/s |  18 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.0 MB/s |  29 kB     00:00
Jan 30 03:42:21 np0005601977 dnf[30483]: delorean-openstack-manila-3c01b7181572c95dac462 287 kB/s |  25 kB     00:00
Jan 30 03:42:21 np0005601977 python3[30684]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-python-whitebox-neutron-tests-tempest- 4.8 MB/s | 154 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-openstack-octavia-ba397f07a7331190208c 1.0 MB/s |  26 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-openstack-watcher-c014f81a8647287f6dcc 762 kB/s |  16 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-ansible-config_template-5ccaa22121a7ff 377 kB/s | 7.4 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.1 MB/s | 144 kB     00:00
Jan 30 03:42:22 np0005601977 python3[30718]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-openstack-swift-dc98a8463506ac520c469a 641 kB/s |  14 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-python-tempestconf-8515371b7cceebd4282 2.1 MB/s |  53 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.9 MB/s |  96 kB     00:00
Jan 30 03:42:22 np0005601977 dnf[30483]: CentOS Stream 9 - BaseOS                         59 kB/s | 6.1 kB     00:00
Jan 30 03:42:22 np0005601977 python3[30812]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:22 np0005601977 dnf[30483]: CentOS Stream 9 - AppStream                      61 kB/s | 6.2 kB     00:00
Jan 30 03:42:22 np0005601977 python3[30840]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:22 np0005601977 dnf[30483]: CentOS Stream 9 - CRB                            59 kB/s | 6.0 kB     00:00
Jan 30 03:42:23 np0005601977 python3[30914]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.1028478-34125-213896461484009/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:23 np0005601977 dnf[30483]: CentOS Stream 9 - Extras packages                31 kB/s | 7.3 kB     00:00
Jan 30 03:42:23 np0005601977 dnf[30483]: dlrn-antelope-testing                           6.0 MB/s | 1.1 MB     00:00
Jan 30 03:42:23 np0005601977 dnf[30483]: dlrn-antelope-build-deps                         15 MB/s | 461 kB     00:00
Jan 30 03:42:23 np0005601977 dnf[30483]: centos9-rabbitmq                                7.9 MB/s | 123 kB     00:00
Jan 30 03:42:24 np0005601977 dnf[30483]: centos9-storage                                 4.9 MB/s | 415 kB     00:00
Jan 30 03:42:24 np0005601977 dnf[30483]: centos9-opstools                                1.5 MB/s |  51 kB     00:00
Jan 30 03:42:24 np0005601977 dnf[30483]: NFV SIG OpenvSwitch                              27 MB/s | 461 kB     00:00
Jan 30 03:42:24 np0005601977 dnf[30483]: repo-setup-centos-highavailability               34 MB/s | 744 kB     00:00
Jan 30 03:42:26 np0005601977 dnf[30483]: Extra Packages for Enterprise Linux 9 - x86_64   11 MB/s |  20 MB     00:01
Jan 30 03:42:33 np0005601977 python3[31000]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:42:38 np0005601977 dnf[30483]: Metadata cache created.
Jan 30 03:42:38 np0005601977 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 30 03:42:38 np0005601977 systemd[1]: Finished dnf makecache.
Jan 30 03:42:38 np0005601977 systemd[1]: dnf-makecache.service: Consumed 13.920s CPU time.
Jan 30 03:47:32 np0005601977 systemd[1]: session-7.scope: Deactivated successfully.
Jan 30 03:47:32 np0005601977 systemd[1]: session-7.scope: Consumed 4.403s CPU time.
Jan 30 03:47:32 np0005601977 systemd-logind[809]: Session 7 logged out. Waiting for processes to exit.
Jan 30 03:47:32 np0005601977 systemd-logind[809]: Removed session 7.
Jan 30 03:58:09 np0005601977 systemd-logind[809]: New session 8 of user zuul.
Jan 30 03:58:09 np0005601977 systemd[1]: Started Session 8 of User zuul.
Jan 30 03:58:10 np0005601977 python3.9[31163]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:11 np0005601977 python3.9[31344]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:58:19 np0005601977 systemd[1]: session-8.scope: Deactivated successfully.
Jan 30 03:58:19 np0005601977 systemd[1]: session-8.scope: Consumed 7.244s CPU time.
Jan 30 03:58:19 np0005601977 systemd-logind[809]: Session 8 logged out. Waiting for processes to exit.
Jan 30 03:58:19 np0005601977 systemd-logind[809]: Removed session 8.
Jan 30 03:58:24 np0005601977 systemd-logind[809]: New session 9 of user zuul.
Jan 30 03:58:25 np0005601977 systemd[1]: Started Session 9 of User zuul.
Jan 30 03:58:26 np0005601977 python3.9[31556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:26 np0005601977 systemd[1]: session-9.scope: Deactivated successfully.
Jan 30 03:58:26 np0005601977 systemd-logind[809]: Session 9 logged out. Waiting for processes to exit.
Jan 30 03:58:26 np0005601977 systemd-logind[809]: Removed session 9.
Jan 30 03:58:42 np0005601977 systemd-logind[809]: New session 10 of user zuul.
Jan 30 03:58:42 np0005601977 systemd[1]: Started Session 10 of User zuul.
Jan 30 03:58:42 np0005601977 python3.9[31737]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 30 03:58:43 np0005601977 python3.9[31911]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:44 np0005601977 python3.9[32063]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:58:45 np0005601977 python3.9[32216]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 03:58:46 np0005601977 python3.9[32368]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:58:47 np0005601977 python3.9[32520]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 03:58:47 np0005601977 python3.9[32643]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763526.718982-172-221139996779551/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:58:48 np0005601977 python3.9[32795]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:49 np0005601977 python3.9[32951]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 03:58:49 np0005601977 python3.9[33103]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 03:58:50 np0005601977 python3.9[33253]: ansible-ansible.builtin.service_facts Invoked
Jan 30 03:58:53 np0005601977 python3.9[33506]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:58:54 np0005601977 python3.9[33656]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:55 np0005601977 python3.9[33810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:56 np0005601977 python3.9[33968]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 03:58:57 np0005601977 python3.9[34052]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 03:59:40 np0005601977 systemd[1]: Reloading.
Jan 30 03:59:40 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:59:40 np0005601977 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 30 03:59:41 np0005601977 systemd[1]: Reloading.
Jan 30 03:59:41 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:59:41 np0005601977 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 30 03:59:41 np0005601977 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 30 03:59:41 np0005601977 systemd[1]: Reloading.
Jan 30 03:59:41 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:59:41 np0005601977 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 30 03:59:41 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 03:59:41 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 03:59:41 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 04:00:37 np0005601977 kernel: SELinux:  Converting 2728 SID table entries...
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:00:37 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:00:37 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 30 04:00:38 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:00:38 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:00:38 np0005601977 systemd[1]: Reloading.
Jan 30 04:00:38 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:00:38 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:00:38 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:00:38 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:00:38 np0005601977 systemd[1]: run-rf8902cbfb1bd4fdf8c71ee3db30a2de2.service: Deactivated successfully.
Jan 30 04:00:45 np0005601977 python3.9[35582]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:00:47 np0005601977 python3.9[35863]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 30 04:00:48 np0005601977 python3.9[36015]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 30 04:00:50 np0005601977 python3.9[36168]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:00:53 np0005601977 python3.9[36320]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 30 04:00:57 np0005601977 python3.9[36472]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:00 np0005601977 python3.9[36624]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:01 np0005601977 python3.9[36747]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763660.313405-661-134016527362904/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:02 np0005601977 python3.9[36914]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:01:03 np0005601977 python3.9[37066]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:03 np0005601977 python3.9[37219]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:05 np0005601977 python3.9[37371]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 30 04:01:05 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:01:05 np0005601977 python3.9[37525]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:01:06 np0005601977 python3.9[37683]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:01:07 np0005601977 python3.9[37843]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 30 04:01:08 np0005601977 python3.9[37996]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:01:09 np0005601977 python3.9[38154]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 30 04:01:10 np0005601977 python3.9[38306]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:01:12 np0005601977 python3.9[38459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:13 np0005601977 python3.9[38611]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:13 np0005601977 python3.9[38734]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763673.0114243-1018-230156106233530/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:15 np0005601977 python3.9[38886]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:01:15 np0005601977 systemd[1]: Starting Load Kernel Modules...
Jan 30 04:01:15 np0005601977 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 30 04:01:15 np0005601977 kernel: Bridge firewalling registered
Jan 30 04:01:15 np0005601977 systemd-modules-load[38890]: Inserted module 'br_netfilter'
Jan 30 04:01:15 np0005601977 systemd[1]: Finished Load Kernel Modules.
Jan 30 04:01:16 np0005601977 python3.9[39045]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:16 np0005601977 python3.9[39168]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763675.5982969-1087-36317495371363/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:17 np0005601977 python3.9[39320]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:01:20 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 04:01:20 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 04:01:20 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:01:20 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:01:20 np0005601977 systemd[1]: Reloading.
Jan 30 04:01:20 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:01:21 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:01:23 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:01:23 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:01:23 np0005601977 systemd[1]: man-db-cache-update.service: Consumed 3.193s CPU time.
Jan 30 04:01:23 np0005601977 systemd[1]: run-r4a5f83e04e4c422899f69874aaca71b3.service: Deactivated successfully.
Jan 30 04:01:24 np0005601977 python3.9[43070]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:01:25 np0005601977 python3.9[43222]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 30 04:01:26 np0005601977 python3.9[43372]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:01:26 np0005601977 python3.9[43524]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:27 np0005601977 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 30 04:01:27 np0005601977 systemd[1]: Starting Authorization Manager...
Jan 30 04:01:27 np0005601977 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 30 04:01:27 np0005601977 polkitd[43741]: Started polkitd version 0.117
Jan 30 04:01:27 np0005601977 systemd[1]: Started Authorization Manager.
Jan 30 04:01:28 np0005601977 python3.9[43911]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:01:28 np0005601977 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 30 04:01:28 np0005601977 systemd[1]: tuned.service: Deactivated successfully.
Jan 30 04:01:28 np0005601977 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 30 04:01:28 np0005601977 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 30 04:01:28 np0005601977 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 30 04:01:29 np0005601977 python3.9[44073]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 30 04:01:33 np0005601977 python3.9[44225]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:01:33 np0005601977 systemd[1]: Reloading.
Jan 30 04:01:33 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:01:34 np0005601977 python3.9[44415]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:01:34 np0005601977 systemd[1]: Reloading.
Jan 30 04:01:34 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:01:35 np0005601977 python3.9[44605]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:36 np0005601977 python3.9[44758]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:36 np0005601977 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 30 04:01:36 np0005601977 python3.9[44911]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:39 np0005601977 python3.9[45073]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:39 np0005601977 python3.9[45226]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:01:39 np0005601977 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 30 04:01:39 np0005601977 systemd[1]: Stopped Apply Kernel Variables.
Jan 30 04:01:39 np0005601977 systemd[1]: Stopping Apply Kernel Variables...
Jan 30 04:01:39 np0005601977 systemd[1]: Starting Apply Kernel Variables...
Jan 30 04:01:39 np0005601977 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 30 04:01:39 np0005601977 systemd[1]: Finished Apply Kernel Variables.
Jan 30 04:01:40 np0005601977 systemd[1]: session-10.scope: Deactivated successfully.
Jan 30 04:01:40 np0005601977 systemd[1]: session-10.scope: Consumed 2min 1.151s CPU time.
Jan 30 04:01:40 np0005601977 systemd-logind[809]: Session 10 logged out. Waiting for processes to exit.
Jan 30 04:01:40 np0005601977 systemd-logind[809]: Removed session 10.
Jan 30 04:01:45 np0005601977 systemd-logind[809]: New session 11 of user zuul.
Jan 30 04:01:45 np0005601977 systemd[1]: Started Session 11 of User zuul.
Jan 30 04:01:46 np0005601977 python3.9[45410]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:01:48 np0005601977 python3.9[45564]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:01:49 np0005601977 python3.9[45720]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:50 np0005601977 python3.9[45871]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:01:51 np0005601977 python3.9[46027]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:01:52 np0005601977 python3.9[46111]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:01:54 np0005601977 python3.9[46264]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:01:55 np0005601977 python3.9[46435]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:55 np0005601977 python3.9[46587]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:55 np0005601977 systemd[1]: var-lib-containers-storage-overlay-compat3754794544-merged.mount: Deactivated successfully.
Jan 30 04:01:55 np0005601977 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck878555077-merged.mount: Deactivated successfully.
Jan 30 04:01:55 np0005601977 podman[46588]: 2026-01-30 09:01:55.893956236 +0000 UTC m=+0.056981728 system refresh
Jan 30 04:01:56 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:01:57 np0005601977 python3.9[46750]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:57 np0005601977 python3.9[46873]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763716.4848638-282-235019439189590/.source.json follow=False _original_basename=podman_network_config.j2 checksum=0e888792d62795051622a14ea3ae623365cd0da1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:58 np0005601977 python3.9[47025]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:58 np0005601977 python3.9[47148]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763717.877803-327-233377837765080/.source.conf follow=False _original_basename=registries.conf.j2 checksum=4891ae8372aa80a8fa92515759173ef122bd9c5c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:59 np0005601977 python3.9[47300]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:00 np0005601977 python3.9[47452]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:00 np0005601977 python3.9[47604]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:01 np0005601977 python3.9[47756]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:02 np0005601977 python3.9[47906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:02:03 np0005601977 python3.9[48060]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:05 np0005601977 python3.9[48213]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:07 np0005601977 python3.9[48374]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:10 np0005601977 python3.9[48527]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:12 np0005601977 python3.9[48680]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:14 np0005601977 python3.9[48836]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:18 np0005601977 python3.9[49004]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:20 np0005601977 python3.9[49157]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:32 np0005601977 python3.9[49491]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:34 np0005601977 python3.9[49647]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:37 np0005601977 python3.9[49804]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:02:37 np0005601977 python3.9[49979]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:02:38 np0005601977 python3.9[50102]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769763757.498476-801-41911908778810/.source.json _original_basename=.qbock09f follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:02:39 np0005601977 python3.9[50254]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:02:39 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:41 np0005601977 systemd[1]: var-lib-containers-storage-overlay-compat561156212-lower\x2dmapped.mount: Deactivated successfully.
Jan 30 04:02:44 np0005601977 podman[50267]: 2026-01-30 09:02:44.495887268 +0000 UTC m=+4.974561086 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 30 04:02:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:46 np0005601977 python3.9[50564]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:02:46 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:54 np0005601977 podman[50575]: 2026-01-30 09:02:54.679301976 +0000 UTC m=+7.716493780 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:02:54 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:54 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:54 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:00 np0005601977 python3.9[50872]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:03:00 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:10 np0005601977 podman[50884]: 2026-01-30 09:03:10.987321235 +0000 UTC m=+10.226554330 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 30 04:03:10 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:11 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:11 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:16 np0005601977 python3.9[51140]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:03:16 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:18 np0005601977 podman[51152]: 2026-01-30 09:03:18.7128834 +0000 UTC m=+2.476885075 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 30 04:03:18 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:18 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:18 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:19 np0005601977 python3.9[51408]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:03:20 np0005601977 podman[51420]: 2026-01-30 09:03:20.793195802 +0000 UTC m=+1.294155714 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 30 04:03:20 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:20 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:20 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:24 np0005601977 systemd-logind[809]: Session 11 logged out. Waiting for processes to exit.
Jan 30 04:03:24 np0005601977 systemd[1]: session-11.scope: Deactivated successfully.
Jan 30 04:03:24 np0005601977 systemd[1]: session-11.scope: Consumed 1min 31.153s CPU time.
Jan 30 04:03:24 np0005601977 systemd-logind[809]: Removed session 11.
Jan 30 04:03:30 np0005601977 systemd-logind[809]: New session 12 of user zuul.
Jan 30 04:03:30 np0005601977 systemd[1]: Started Session 12 of User zuul.
Jan 30 04:03:31 np0005601977 python3.9[51719]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:03:32 np0005601977 python3.9[51875]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 30 04:03:32 np0005601977 python3.9[52028]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:03:33 np0005601977 python3.9[52186]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:03:34 np0005601977 python3.9[52346]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:03:35 np0005601977 python3.9[52430]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:03:38 np0005601977 python3.9[52591]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:03:51 np0005601977 kernel: SELinux:  Converting 2741 SID table entries...
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:03:51 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:03:51 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 30 04:03:51 np0005601977 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 30 04:03:52 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:03:52 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:03:52 np0005601977 systemd[1]: Reloading.
Jan 30 04:03:52 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:03:52 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:03:52 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:03:53 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:03:53 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:03:53 np0005601977 systemd[1]: run-r01e001eb3c514b288834c2da419ba173.service: Deactivated successfully.
Jan 30 04:03:55 np0005601977 python3.9[53691]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:03:55 np0005601977 systemd[1]: Reloading.
Jan 30 04:03:55 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:03:55 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:03:55 np0005601977 systemd[1]: Starting Open vSwitch Database Unit...
Jan 30 04:03:55 np0005601977 chown[53733]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 30 04:03:55 np0005601977 ovs-ctl[53738]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 30 04:03:55 np0005601977 ovs-ctl[53738]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 30 04:03:55 np0005601977 ovs-ctl[53738]: Starting ovsdb-server [  OK  ]
Jan 30 04:03:55 np0005601977 ovs-vsctl[53787]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 30 04:03:55 np0005601977 ovs-vsctl[53803]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"9be64184-856f-4986-a80e-9403fa35a6a5\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 30 04:03:55 np0005601977 ovs-ctl[53738]: Configuring Open vSwitch system IDs [  OK  ]
Jan 30 04:03:55 np0005601977 ovs-ctl[53738]: Enabling remote OVSDB managers [  OK  ]
Jan 30 04:03:55 np0005601977 ovs-vsctl[53813]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 30 04:03:55 np0005601977 systemd[1]: Started Open vSwitch Database Unit.
Jan 30 04:03:55 np0005601977 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 30 04:03:55 np0005601977 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 30 04:03:55 np0005601977 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 30 04:03:55 np0005601977 kernel: openvswitch: Open vSwitch switching datapath
Jan 30 04:03:55 np0005601977 ovs-ctl[53857]: Inserting openvswitch module [  OK  ]
Jan 30 04:03:56 np0005601977 ovs-ctl[53826]: Starting ovs-vswitchd [  OK  ]
Jan 30 04:03:56 np0005601977 ovs-ctl[53826]: Enabling remote OVSDB managers [  OK  ]
Jan 30 04:03:56 np0005601977 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 30 04:03:56 np0005601977 ovs-vsctl[53874]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 30 04:03:56 np0005601977 systemd[1]: Starting Open vSwitch...
Jan 30 04:03:56 np0005601977 systemd[1]: Finished Open vSwitch.
Jan 30 04:03:56 np0005601977 python3.9[54026]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:03:57 np0005601977 python3.9[54178]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 30 04:03:59 np0005601977 kernel: SELinux:  Converting 2755 SID table entries...
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:03:59 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:04:00 np0005601977 python3.9[54333]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:04:01 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 30 04:04:01 np0005601977 python3.9[54491]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:04:03 np0005601977 python3.9[54644]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:04:05 np0005601977 python3.9[54931]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 30 04:04:06 np0005601977 python3.9[55081]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:04:06 np0005601977 python3.9[55235]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:04:08 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:04:08 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:04:08 np0005601977 systemd[1]: Reloading.
Jan 30 04:04:08 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:04:08 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:04:08 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:04:09 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:04:09 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:04:09 np0005601977 systemd[1]: run-r5ec3a96c30f94d7d9be9ed7090080929.service: Deactivated successfully.
Jan 30 04:04:10 np0005601977 python3.9[55552]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:04:10 np0005601977 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 30 04:04:10 np0005601977 systemd[1]: Stopped Network Manager Wait Online.
Jan 30 04:04:10 np0005601977 systemd[1]: Stopping Network Manager Wait Online...
Jan 30 04:04:10 np0005601977 systemd[1]: Stopping Network Manager...
Jan 30 04:04:10 np0005601977 NetworkManager[7189]: <info>  [1769763850.2085] caught SIGTERM, shutting down normally.
Jan 30 04:04:10 np0005601977 NetworkManager[7189]: <info>  [1769763850.2108] dhcp4 (eth0): canceled DHCP transaction
Jan 30 04:04:10 np0005601977 NetworkManager[7189]: <info>  [1769763850.2108] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 04:04:10 np0005601977 NetworkManager[7189]: <info>  [1769763850.2108] dhcp4 (eth0): state changed no lease
Jan 30 04:04:10 np0005601977 NetworkManager[7189]: <info>  [1769763850.2112] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 04:04:10 np0005601977 NetworkManager[7189]: <info>  [1769763850.2192] exiting (success)
Jan 30 04:04:10 np0005601977 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 04:04:10 np0005601977 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 04:04:10 np0005601977 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 30 04:04:10 np0005601977 systemd[1]: Stopped Network Manager.
Jan 30 04:04:10 np0005601977 systemd[1]: NetworkManager.service: Consumed 16.630s CPU time, 4.1M memory peak, read 0B from disk, written 44.5K to disk.
Jan 30 04:04:10 np0005601977 systemd[1]: Starting Network Manager...
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.2935] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:c4148084-b675-426a-931e-95e26d0c5cd7)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.2937] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.2984] manager[0x55736d06d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 30 04:04:10 np0005601977 systemd[1]: Starting Hostname Service...
Jan 30 04:04:10 np0005601977 systemd[1]: Started Hostname Service.
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3787] hostname: hostname: using hostnamed
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3789] hostname: static hostname changed from (none) to "compute-0"
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3797] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3805] manager[0x55736d06d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3805] manager[0x55736d06d000]: rfkill: WWAN hardware radio set enabled
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3837] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3849] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3849] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3850] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3851] manager: Networking is enabled by state file
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3854] settings: Loaded settings plugin: keyfile (internal)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3858] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3893] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3905] dhcp: init: Using DHCP client 'internal'
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3909] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3915] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3922] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3932] device (lo): Activation: starting connection 'lo' (c19eaec1-d83a-4993-82a8-76b7a83473d3)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3939] device (eth0): carrier: link connected
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3944] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3950] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3950] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3957] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3966] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3973] device (eth1): carrier: link connected
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3976] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3983] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (694eea5b-2a08-5ef5-80f9-9cd3b604ccf1) (indicated)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3984] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.3993] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4005] device (eth1): Activation: starting connection 'ci-private-network' (694eea5b-2a08-5ef5-80f9-9cd3b604ccf1)
Jan 30 04:04:10 np0005601977 systemd[1]: Started Network Manager.
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4015] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4033] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4038] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4041] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4043] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4046] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4049] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4052] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4056] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4063] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4066] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4074] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4085] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4094] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4096] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4101] device (lo): Activation: successful, device activated.
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4108] dhcp4 (eth0): state changed new lease, address=38.102.83.194
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4114] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 30 04:04:10 np0005601977 systemd[1]: Starting Network Manager Wait Online...
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4271] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4286] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4291] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4294] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4296] device (eth1): Activation: successful, device activated.
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4325] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4327] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4330] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4332] device (eth0): Activation: successful, device activated.
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4336] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 30 04:04:10 np0005601977 NetworkManager[55565]: <info>  [1769763850.4338] manager: startup complete
Jan 30 04:04:10 np0005601977 systemd[1]: Finished Network Manager Wait Online.
Jan 30 04:04:11 np0005601977 python3.9[55779]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:04:17 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:04:17 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:04:17 np0005601977 systemd[1]: Reloading.
Jan 30 04:04:17 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:04:17 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:04:17 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:04:20 np0005601977 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 04:04:20 np0005601977 python3.9[56239]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:04:20 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:04:20 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:04:20 np0005601977 systemd[1]: run-ra5109d169a644980be7cc2660c4ece30.service: Deactivated successfully.
Jan 30 04:04:21 np0005601977 python3.9[56392]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:22 np0005601977 python3.9[56546]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:23 np0005601977 python3.9[56698]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:23 np0005601977 python3.9[56850]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:24 np0005601977 python3.9[57002]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:25 np0005601977 python3.9[57154]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:04:25 np0005601977 python3.9[57277]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763864.636613-642-150409792194892/.source _original_basename=.56cf4ba8 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:26 np0005601977 python3.9[57429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:27 np0005601977 python3.9[57581]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 30 04:04:27 np0005601977 python3.9[57733]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:30 np0005601977 python3.9[58160]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 30 04:04:31 np0005601977 ansible-async_wrapper.py[58335]: Invoked with j1499334465 300 /home/zuul/.ansible/tmp/ansible-tmp-1769763870.3220658-840-81450956398801/AnsiballZ_edpm_os_net_config.py _
Jan 30 04:04:31 np0005601977 ansible-async_wrapper.py[58338]: Starting module and watcher
Jan 30 04:04:31 np0005601977 ansible-async_wrapper.py[58338]: Start watching 58339 (300)
Jan 30 04:04:31 np0005601977 ansible-async_wrapper.py[58339]: Start module (58339)
Jan 30 04:04:31 np0005601977 ansible-async_wrapper.py[58335]: Return async_wrapper task started.
Jan 30 04:04:31 np0005601977 python3.9[58340]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 30 04:04:32 np0005601977 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 30 04:04:32 np0005601977 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 30 04:04:32 np0005601977 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 30 04:04:32 np0005601977 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 30 04:04:32 np0005601977 kernel: cfg80211: failed to load regulatory.db
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.2808] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.2826] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3368] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3370] audit: op="connection-add" uuid="4fbd9405-0f55-47b1-bf43-17d74ed27365" name="br-ex-br" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3384] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3386] audit: op="connection-add" uuid="cddbd553-eba7-46a0-ae65-4772d460998c" name="br-ex-port" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3398] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3400] audit: op="connection-add" uuid="b6002d29-c687-47cb-939a-e889cb062aad" name="eth1-port" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3411] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3412] audit: op="connection-add" uuid="6fa33c5b-5191-485c-ad08-bdf0fcedd49b" name="vlan20-port" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3423] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3425] audit: op="connection-add" uuid="bb6bb6cb-2681-4529-8c29-92b6cf52cea4" name="vlan21-port" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3436] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3438] audit: op="connection-add" uuid="05977e10-7132-4810-8275-02a4d71b492a" name="vlan22-port" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3456] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3473] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.3474] audit: op="connection-add" uuid="14d1ae43-0e65-4180-9be7-1dcb88218230" name="br-ex-if" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4037] audit: op="connection-update" uuid="694eea5b-2a08-5ef5-80f9-9cd3b604ccf1" name="ci-private-network" args="ipv6.dns,ipv6.method,ipv6.addresses,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ovs-external-ids.data,ovs-interface.type,connection.port-type,connection.slave-type,connection.controller,connection.timestamp,connection.master" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4058] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4060] audit: op="connection-add" uuid="7347fcfb-bd51-4838-ad35-e7e118871822" name="vlan20-if" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4079] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4081] audit: op="connection-add" uuid="1e3a5748-844b-4c57-bb0c-1564c0d1dcf6" name="vlan21-if" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4099] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4100] audit: op="connection-add" uuid="e5254972-197d-48df-a0dc-3cff69dcfc31" name="vlan22-if" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4114] audit: op="connection-delete" uuid="6c9678c7-6658-3f20-a071-454898fc78dd" name="Wired connection 1" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4127] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4129] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4135] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4139] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (4fbd9405-0f55-47b1-bf43-17d74ed27365)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4139] audit: op="connection-activate" uuid="4fbd9405-0f55-47b1-bf43-17d74ed27365" name="br-ex-br" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4141] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4142] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4148] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4154] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (cddbd553-eba7-46a0-ae65-4772d460998c)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4156] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4157] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4162] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4166] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (b6002d29-c687-47cb-939a-e889cb062aad)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4168] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4169] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4174] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4178] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (6fa33c5b-5191-485c-ad08-bdf0fcedd49b)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4180] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4181] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4187] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4191] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (bb6bb6cb-2681-4529-8c29-92b6cf52cea4)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4193] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4194] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4200] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4204] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (05977e10-7132-4810-8275-02a4d71b492a)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4205] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4208] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4210] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4216] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4217] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4221] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4226] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (14d1ae43-0e65-4180-9be7-1dcb88218230)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4227] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4230] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4232] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4234] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4235] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4247] device (eth1): disconnecting for new activation request.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4248] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4250] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4253] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4254] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4257] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4258] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4261] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4265] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7347fcfb-bd51-4838-ad35-e7e118871822)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4266] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4269] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4271] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4272] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4275] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4276] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4280] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4284] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (1e3a5748-844b-4c57-bb0c-1564c0d1dcf6)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4285] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4288] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4290] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4291] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4294] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <warn>  [1769763873.4295] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4298] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4303] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e5254972-197d-48df-a0dc-3cff69dcfc31)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4303] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4307] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4308] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4310] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4311] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4328] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,802-3-ethernet.mtu" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4330] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4335] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4337] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4345] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4349] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4354] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 kernel: ovs-system: entered promiscuous mode
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4367] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4368] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4373] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4377] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4381] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4385] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 kernel: Timeout policy base is empty
Jan 30 04:04:33 np0005601977 systemd-udevd[58346]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4391] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4394] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4397] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4398] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4403] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4407] dhcp4 (eth0): canceled DHCP transaction
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4407] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4407] dhcp4 (eth0): state changed no lease
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4409] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4416] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4419] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58341 uid=0 result="fail" reason="Device is not activated"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4425] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 04:04:33 np0005601977 kernel: br-ex: entered promiscuous mode
Jan 30 04:04:33 np0005601977 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 04:04:33 np0005601977 kernel: vlan20: entered promiscuous mode
Jan 30 04:04:33 np0005601977 systemd-udevd[58344]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601977 kernel: vlan21: entered promiscuous mode
Jan 30 04:04:33 np0005601977 systemd-udevd[58345]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4965] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4971] dhcp4 (eth0): state changed new lease, address=38.102.83.194
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4983] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.4998] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5006] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5014] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601977 kernel: vlan22: entered promiscuous mode
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5619] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5722] device (eth1): Activation: starting connection 'ci-private-network' (694eea5b-2a08-5ef5-80f9-9cd3b604ccf1)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5729] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5731] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5734] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5736] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5738] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5740] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5743] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5754] device (eth1): disconnecting for new activation request.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5755] audit: op="connection-activate" uuid="694eea5b-2a08-5ef5-80f9-9cd3b604ccf1" name="ci-private-network" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5784] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5789] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5798] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5808] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5820] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5827] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5832] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5837] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5842] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5847] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5852] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5857] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5862] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5897] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5909] device (eth1): Activation: starting connection 'ci-private-network' (694eea5b-2a08-5ef5-80f9-9cd3b604ccf1)
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5915] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58341 uid=0 result="success"
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5919] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5949] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5959] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5975] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.5986] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6013] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6028] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6033] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 30 04:04:33 np0005601977 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6047] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6069] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6079] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6088] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6103] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6106] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6121] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6128] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6137] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6145] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6154] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6156] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6163] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6171] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6180] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601977 NetworkManager[55565]: <info>  [1769763873.6189] device (eth1): Activation: successful, device activated.
Jan 30 04:04:34 np0005601977 NetworkManager[55565]: <info>  [1769763874.7542] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58341 uid=0 result="success"
Jan 30 04:04:34 np0005601977 python3.9[58679]: ansible-ansible.legacy.async_status Invoked with jid=j1499334465.58335 mode=status _async_dir=/root/.ansible_async
Jan 30 04:04:34 np0005601977 NetworkManager[55565]: <info>  [1769763874.9276] checkpoint[0x55736d042950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 30 04:04:34 np0005601977 NetworkManager[55565]: <info>  [1769763874.9279] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.1672] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.1685] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.4829] audit: op="networking-control" arg="global-dns-configuration" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.5284] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.6643] audit: op="networking-control" arg="global-dns-configuration" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.6676] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.8029] checkpoint[0x55736d042a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 30 04:04:35 np0005601977 NetworkManager[55565]: <info>  [1769763875.8034] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58341 uid=0 result="success"
Jan 30 04:04:35 np0005601977 ansible-async_wrapper.py[58339]: Module complete (58339)
Jan 30 04:04:36 np0005601977 ansible-async_wrapper.py[58338]: Done in kid B.
Jan 30 04:04:38 np0005601977 python3.9[58785]: ansible-ansible.legacy.async_status Invoked with jid=j1499334465.58335 mode=status _async_dir=/root/.ansible_async
Jan 30 04:04:39 np0005601977 python3.9[58885]: ansible-ansible.legacy.async_status Invoked with jid=j1499334465.58335 mode=cleanup _async_dir=/root/.ansible_async
Jan 30 04:04:39 np0005601977 python3.9[59037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:04:40 np0005601977 python3.9[59160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763879.3951309-921-281074279914874/.source.returncode _original_basename=.gb0z5i16 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:40 np0005601977 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 04:04:41 np0005601977 python3.9[59314]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:04:41 np0005601977 python3.9[59437]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763880.5906749-969-262585494212219/.source.cfg _original_basename=.lm19xajr follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:42 np0005601977 python3.9[59590]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:04:42 np0005601977 systemd[1]: Reloading Network Manager...
Jan 30 04:04:42 np0005601977 NetworkManager[55565]: <info>  [1769763882.3985] audit: op="reload" arg="0" pid=59594 uid=0 result="success"
Jan 30 04:04:42 np0005601977 NetworkManager[55565]: <info>  [1769763882.3996] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 30 04:04:42 np0005601977 systemd[1]: Reloaded Network Manager.
Jan 30 04:04:42 np0005601977 systemd[1]: session-12.scope: Deactivated successfully.
Jan 30 04:04:42 np0005601977 systemd[1]: session-12.scope: Consumed 45.203s CPU time.
Jan 30 04:04:42 np0005601977 systemd-logind[809]: Session 12 logged out. Waiting for processes to exit.
Jan 30 04:04:42 np0005601977 systemd-logind[809]: Removed session 12.
Jan 30 04:04:51 np0005601977 systemd-logind[809]: New session 13 of user zuul.
Jan 30 04:04:51 np0005601977 systemd[1]: Started Session 13 of User zuul.
Jan 30 04:04:52 np0005601977 python3.9[59779]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:04:52 np0005601977 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 04:04:53 np0005601977 python3.9[59934]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:04:54 np0005601977 python3.9[60123]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:04:55 np0005601977 systemd[1]: session-13.scope: Deactivated successfully.
Jan 30 04:04:55 np0005601977 systemd[1]: session-13.scope: Consumed 2.119s CPU time.
Jan 30 04:04:55 np0005601977 systemd-logind[809]: Session 13 logged out. Waiting for processes to exit.
Jan 30 04:04:55 np0005601977 systemd-logind[809]: Removed session 13.
Jan 30 04:05:00 np0005601977 systemd-logind[809]: New session 14 of user zuul.
Jan 30 04:05:00 np0005601977 systemd[1]: Started Session 14 of User zuul.
Jan 30 04:05:01 np0005601977 python3.9[60305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:02 np0005601977 python3.9[60459]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:03 np0005601977 python3.9[60615]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:04 np0005601977 python3.9[60700]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:05:06 np0005601977 python3.9[60853]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:07 np0005601977 python3.9[61044]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:08 np0005601977 python3.9[61196]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:05:08 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:05:09 np0005601977 python3.9[61359]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:09 np0005601977 python3.9[61437]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:10 np0005601977 python3.9[61589]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:10 np0005601977 python3.9[61667]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:11 np0005601977 python3.9[61819]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:12 np0005601977 python3.9[61971]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:12 np0005601977 python3.9[62123]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:13 np0005601977 python3.9[62275]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:14 np0005601977 python3.9[62427]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:05:16 np0005601977 python3.9[62580]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:17 np0005601977 python3.9[62734]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:05:18 np0005601977 python3.9[62886]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:05:19 np0005601977 python3.9[63038]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:05:20 np0005601977 python3.9[63191]: ansible-service_facts Invoked
Jan 30 04:05:20 np0005601977 network[63208]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:05:20 np0005601977 network[63209]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:05:20 np0005601977 network[63210]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:05:26 np0005601977 python3.9[63662]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:05:29 np0005601977 python3.9[63815]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 30 04:05:30 np0005601977 python3.9[63967]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:31 np0005601977 python3.9[64092]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763929.9353082-651-104936379983961/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:31 np0005601977 python3.9[64246]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:32 np0005601977 python3.9[64371]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763931.3400204-696-87966731232750/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:34 np0005601977 python3.9[64525]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:35 np0005601977 python3.9[64679]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:36 np0005601977 python3.9[64763]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:05:38 np0005601977 python3.9[64917]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:39 np0005601977 python3.9[65001]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:05:39 np0005601977 chronyd[829]: chronyd exiting
Jan 30 04:05:39 np0005601977 systemd[1]: Stopping NTP client/server...
Jan 30 04:05:39 np0005601977 systemd[1]: chronyd.service: Deactivated successfully.
Jan 30 04:05:39 np0005601977 systemd[1]: Stopped NTP client/server.
Jan 30 04:05:39 np0005601977 systemd[1]: Starting NTP client/server...
Jan 30 04:05:39 np0005601977 chronyd[65010]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 30 04:05:39 np0005601977 chronyd[65010]: Frequency -28.553 +/- 0.143 ppm read from /var/lib/chrony/drift
Jan 30 04:05:39 np0005601977 chronyd[65010]: Loaded seccomp filter (level 2)
Jan 30 04:05:39 np0005601977 systemd[1]: Started NTP client/server.
Jan 30 04:05:46 np0005601977 systemd[1]: session-14.scope: Deactivated successfully.
Jan 30 04:05:46 np0005601977 systemd[1]: session-14.scope: Consumed 22.412s CPU time.
Jan 30 04:05:46 np0005601977 systemd-logind[809]: Session 14 logged out. Waiting for processes to exit.
Jan 30 04:05:46 np0005601977 systemd-logind[809]: Removed session 14.
Jan 30 04:05:51 np0005601977 systemd-logind[809]: New session 15 of user zuul.
Jan 30 04:05:51 np0005601977 systemd[1]: Started Session 15 of User zuul.
Jan 30 04:05:52 np0005601977 python3.9[65189]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:53 np0005601977 python3.9[65345]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:54 np0005601977 python3.9[65520]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:54 np0005601977 python3.9[65598]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.w5lqspw9 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:56 np0005601977 python3.9[65750]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:56 np0005601977 python3.9[65873]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763955.5401978-138-216612449509012/.source _original_basename=.4f9wn77_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:57 np0005601977 python3.9[66025]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:57 np0005601977 python3.9[66177]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:58 np0005601977 python3.9[66300]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763957.508416-210-148356409048766/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:58 np0005601977 python3.9[66452]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:59 np0005601977 python3.9[66575]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763958.5491629-210-223209824095940/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:06:00 np0005601977 python3.9[66727]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:00 np0005601977 python3.9[66879]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:01 np0005601977 python3.9[67002]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763960.3077402-321-153340439868928/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:01 np0005601977 python3.9[67154]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:02 np0005601977 python3.9[67277]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763961.415473-366-28825188423659/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:03 np0005601977 python3.9[67429]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:03 np0005601977 systemd[1]: Reloading.
Jan 30 04:06:03 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:03 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:03 np0005601977 systemd[1]: Reloading.
Jan 30 04:06:03 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:03 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:03 np0005601977 systemd[1]: Starting EDPM Container Shutdown...
Jan 30 04:06:03 np0005601977 systemd[1]: Finished EDPM Container Shutdown.
Jan 30 04:06:04 np0005601977 python3.9[67659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:05 np0005601977 python3.9[67782]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763964.0760512-435-61473629558312/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:05 np0005601977 python3.9[67934]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:06 np0005601977 python3.9[68057]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763965.210686-480-170012446258573/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:06 np0005601977 python3.9[68209]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:06 np0005601977 systemd[1]: Reloading.
Jan 30 04:06:06 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:06 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:07 np0005601977 systemd[1]: Reloading.
Jan 30 04:06:07 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:07 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:07 np0005601977 systemd[1]: Starting Create netns directory...
Jan 30 04:06:07 np0005601977 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 30 04:06:07 np0005601977 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 30 04:06:07 np0005601977 systemd[1]: Finished Create netns directory.
Jan 30 04:06:08 np0005601977 python3.9[68435]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:06:08 np0005601977 network[68452]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:06:08 np0005601977 network[68453]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:06:08 np0005601977 network[68454]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:06:11 np0005601977 python3.9[68716]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:12 np0005601977 systemd[1]: Reloading.
Jan 30 04:06:12 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:12 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:12 np0005601977 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 30 04:06:12 np0005601977 iptables.init[68755]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 30 04:06:12 np0005601977 iptables.init[68755]: iptables: Flushing firewall rules: [  OK  ]
Jan 30 04:06:12 np0005601977 systemd[1]: iptables.service: Deactivated successfully.
Jan 30 04:06:12 np0005601977 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 30 04:06:13 np0005601977 python3.9[68951]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:14 np0005601977 python3.9[69105]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:14 np0005601977 systemd[1]: Reloading.
Jan 30 04:06:14 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:14 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:14 np0005601977 systemd[1]: Starting Netfilter Tables...
Jan 30 04:06:14 np0005601977 systemd[1]: Finished Netfilter Tables.
Jan 30 04:06:15 np0005601977 python3.9[69297]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:16 np0005601977 python3.9[69450]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:16 np0005601977 python3.9[69575]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763975.6539752-687-68242518489699/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:18 np0005601977 python3.9[69728]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:06:18 np0005601977 systemd[1]: Reloading OpenSSH server daemon...
Jan 30 04:06:18 np0005601977 systemd[1]: Reloaded OpenSSH server daemon.
Jan 30 04:06:18 np0005601977 python3.9[69884]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:19 np0005601977 python3.9[70036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:19 np0005601977 python3.9[70159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763979.0351071-780-192310192222062/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:20 np0005601977 python3.9[70311]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 30 04:06:20 np0005601977 systemd[1]: Starting Time & Date Service...
Jan 30 04:06:21 np0005601977 systemd[1]: Started Time & Date Service.
Jan 30 04:06:21 np0005601977 python3.9[70467]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:22 np0005601977 python3.9[70619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:22 np0005601977 python3.9[70742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763981.983112-885-170978487180172/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:23 np0005601977 python3.9[70894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:24 np0005601977 python3.9[71017]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763983.1148438-930-78147310660828/.source.yaml _original_basename=.ig4ew9ei follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:24 np0005601977 python3.9[71169]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:25 np0005601977 python3.9[71292]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763984.1943476-975-12882893180284/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:25 np0005601977 python3.9[71444]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:26 np0005601977 python3.9[71597]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:27 np0005601977 python3[71750]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:06:27 np0005601977 python3.9[71902]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:28 np0005601977 python3.9[72025]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763987.358774-1092-178352468946312/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:28 np0005601977 python3.9[72177]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:29 np0005601977 python3.9[72300]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763988.541213-1137-273999061428957/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:30 np0005601977 python3.9[72452]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:30 np0005601977 python3.9[72575]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763989.6780972-1182-75704992564399/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:31 np0005601977 python3.9[72727]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:31 np0005601977 python3.9[72850]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763990.7824192-1227-155839983996506/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:32 np0005601977 python3.9[73002]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:32 np0005601977 python3.9[73125]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763991.9469604-1272-235757829698267/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:33 np0005601977 python3.9[73277]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:34 np0005601977 python3.9[73429]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:35 np0005601977 python3.9[73588]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:35 np0005601977 python3.9[73741]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:36 np0005601977 python3.9[73893]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:37 np0005601977 python3.9[74045]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 30 04:06:37 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:06:37 np0005601977 python3.9[74199]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 30 04:06:38 np0005601977 systemd[1]: session-15.scope: Deactivated successfully.
Jan 30 04:06:38 np0005601977 systemd[1]: session-15.scope: Consumed 30.777s CPU time.
Jan 30 04:06:38 np0005601977 systemd-logind[809]: Session 15 logged out. Waiting for processes to exit.
Jan 30 04:06:38 np0005601977 systemd-logind[809]: Removed session 15.
Jan 30 04:06:43 np0005601977 systemd-logind[809]: New session 16 of user zuul.
Jan 30 04:06:43 np0005601977 systemd[1]: Started Session 16 of User zuul.
Jan 30 04:06:44 np0005601977 python3.9[74380]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 30 04:06:45 np0005601977 python3.9[74532]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:06:46 np0005601977 python3.9[74684]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:06:47 np0005601977 python3.9[74836]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCU0CWF83KfXqSNJSVlw6CgEfyzQlVc+sQ44mcTHx301r7khNpqGqzzMl2b7B3g23zeOn2rm6WkcSpEWp9NX1E4FzlirH03uNb4wmfTil8FM0ijgykv8ayTE+qg8rqjVN/609BkbvtuXEXKvifnL2QLn5d86JoMfCX4sZANxlKS0zIXNNzOBWfSuQwG9mcnFwqhhkxuKK6MvSWgH2by+gVux3vL2E+9Hp4A5jNgMsfbW7Mq2euCLpntWO+yOZTGN9eLLPWiwuU0k+gM8FTa94oNFKnKgCvyJ1Yvg8J93qul2lVDzABn7E2fibM0HHLQtYqyzaUeT5cZ+wj8IAEo8jJAzLVNNiJ1oQMvubGWdIKqe4xqnCPaHoxgm9aZDfmS7jAuvdckG23zZ0JhSsWlrtN7xI9QZ810/hmRpikxGQLwKhiy9r8eoNZfI9KpodKz+Fe4hGBj9Q+HVg+jQ5arbPWHcmHTzigE8RTwfEaquYWbkoTPrSY91r+5IPJhckAV3bk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPwxHcThrtnoKJGePVfRk58vCQwCWYpQ4iFWVlZQL0zh#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMPX+Rlv1VZVTcJHTQQZnMHb8kJKLhFtYoOy8z2Chbt2lwi784rpWFzb7jWcZongf3UHJoSP+5IOd0+d2b54PM0=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwf6eb4Ud21yPWubeGZA8uqtixuwZzeZ5UoBadQadVnxVH49Nw/8ibZzt2wsZTIKtbNU2R+eHrDaNVB1QxJpVYXvYDUsG2RWKZCFKz2SZ1tEQam/R+2D1hXGc9qrD7+TEPDx2Wwrc1ss+ednMEHn2Gzy9CEjhMe4wcJF98yV52TUd6QHjOK8V+5pjSX18HaGbYe+l7oUb5mu8HWJkRVT5UlSWHFNksxYtLhhWMLshFBvIFNyHvIyYu6CSVwJJ2u6EbGORY9hnxfgn19lmHuOFr3KM1piYtNbTo6Y0A4ihCObgmnlyjzosioo92t/7T5zL3zC3HRHO8LUIfOvGRqe3ZSDDX7r7vc3u5p4dMxUnqag2BMfwS4Yz5s16GMDMWb1NfzKCsI0RGbIglZ8ZE7HWvULCMQGFbk0oAYI5B4+5b+/1b2ErJHvmhy77YeMDj+ZZkcJBXLVwyCUYYmyarfkr6dDJnuM7fkW5rOnfzcCketCIZCcQnpkBAKV/IudZE1/c=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIL/P3OiVbM10OAPjF9Amd7xZhZLxY4V0EuaAx+wvjSF/#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAV7AMUgUeqiy7iD31QxTURySlKR2fBxMPCblDeBG/dj+f9J1PQ8cZLAV3XTLdCxjk7cuvqZgV97OuUkxYig25g=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCRk+IxZ0/uyz964ul4FIalWLU+RmsFaMHhJTZwt8ByL08m86HRErkmNy7Z6oJd2fxocJ/fCs5OvEyX/ZUPGxV3Mdbt0IwdCMTAb7Z/A4w+ie2xsV62iidAd4Wi03Wd4m0C7UCv3kNtNk8rYqBZ89W5iMmhXXxZNrul7hmbDHACoG3cA9hefsGM0x52FXCZggf/iwYhbSE44ql/H+/TnZedC9ooElCLTzIV9JGpzCvErYQ8RGZ07EOfIRgSZ0Pa7mUwTRKgiGsi2KPaX6MUL4KHPQk3gBiwY38ibUR9mpxEnbCq5FQwaKLar2+KgMZAZ2v+iPXKa4V2nCL0MP8tZrtovt65vorRrmz7oWxyAwu/FMyHS9ogS9yeAH7pRFasZ7cru3FC3mHim5acBZnuiek8BCcTvxdsJFGlYEHwtiUKSUF3nysO3JfWii2Pe4WeVK63UdTfAaGeC7J+AV4a9BFTf98lUdXDM1osJ08Z6pWmnrCxASYHGYogUhwgL0NnRH0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICurNT8B2BrfPznsK5CLFzT9Xwr6Yrnz0KCZMpcruyIL#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF1KO3gRI6MewvlkrVvSUR1n2NF/WbCjfKMKcsninu/Qnl23QC5T9OSewdOY7mdImHiKVFMnjt5d4TIcXgyEQ+I=#012 create=True mode=0644 path=/tmp/ansible.naor4q2b state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:48 np0005601977 python3.9[74988]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.naor4q2b' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:49 np0005601977 python3.9[75142]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.naor4q2b state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:49 np0005601977 systemd[1]: session-16.scope: Deactivated successfully.
Jan 30 04:06:49 np0005601977 systemd[1]: session-16.scope: Consumed 2.971s CPU time.
Jan 30 04:06:49 np0005601977 systemd-logind[809]: Session 16 logged out. Waiting for processes to exit.
Jan 30 04:06:49 np0005601977 systemd-logind[809]: Removed session 16.
Jan 30 04:06:51 np0005601977 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 30 04:06:54 np0005601977 systemd-logind[809]: New session 17 of user zuul.
Jan 30 04:06:54 np0005601977 systemd[1]: Started Session 17 of User zuul.
Jan 30 04:06:55 np0005601977 python3.9[75323]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:06:57 np0005601977 python3.9[75479]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 30 04:06:57 np0005601977 python3.9[75633]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:06:58 np0005601977 python3.9[75786]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:59 np0005601977 python3.9[75939]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:07:00 np0005601977 python3.9[76093]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:07:01 np0005601977 python3.9[76248]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:01 np0005601977 systemd[1]: session-17.scope: Deactivated successfully.
Jan 30 04:07:01 np0005601977 systemd[1]: session-17.scope: Consumed 3.875s CPU time.
Jan 30 04:07:01 np0005601977 systemd-logind[809]: Session 17 logged out. Waiting for processes to exit.
Jan 30 04:07:01 np0005601977 systemd-logind[809]: Removed session 17.
Jan 30 04:07:06 np0005601977 systemd-logind[809]: New session 18 of user zuul.
Jan 30 04:07:06 np0005601977 systemd[1]: Started Session 18 of User zuul.
Jan 30 04:07:07 np0005601977 python3.9[76427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:07:08 np0005601977 python3.9[76583]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:07:09 np0005601977 python3.9[76667]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:07:11 np0005601977 python3.9[76818]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:07:13 np0005601977 python3.9[76969]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:07:13 np0005601977 python3.9[77119]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:07:14 np0005601977 python3.9[77269]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:07:14 np0005601977 systemd[1]: session-18.scope: Deactivated successfully.
Jan 30 04:07:14 np0005601977 systemd[1]: session-18.scope: Consumed 5.301s CPU time.
Jan 30 04:07:14 np0005601977 systemd-logind[809]: Session 18 logged out. Waiting for processes to exit.
Jan 30 04:07:14 np0005601977 systemd-logind[809]: Removed session 18.
Jan 30 04:07:19 np0005601977 systemd-logind[809]: New session 19 of user zuul.
Jan 30 04:07:19 np0005601977 systemd[1]: Started Session 19 of User zuul.
Jan 30 04:07:20 np0005601977 python3.9[77447]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:07:22 np0005601977 python3.9[77603]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:23 np0005601977 python3.9[77755]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:24 np0005601977 python3.9[77907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:24 np0005601977 python3.9[78030]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764043.6151261-149-239407586054877/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=97f0fa389d90f12079eee972f0bdf2ee9281e30c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:25 np0005601977 python3.9[78182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:25 np0005601977 python3.9[78305]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764044.980201-149-276040060813876/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c4ac8da60190f1ab0f303a8bc23860222db62bf0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:26 np0005601977 python3.9[78457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:27 np0005601977 python3.9[78580]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764046.0778859-149-247080599531576/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=218ea312d48216239170d51e88d56aafdec2f557 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:27 np0005601977 python3.9[78732]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:28 np0005601977 python3.9[78884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:28 np0005601977 python3.9[79036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:29 np0005601977 python3.9[79159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764048.2980955-323-130206136012906/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=61708f26df6a4adebb5cff3600c3f82b420d9ed8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:29 np0005601977 python3.9[79311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:30 np0005601977 python3.9[79434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764049.4043298-323-188195946675332/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=20e5b9de999b0f369806d5bf4ae2497d7bf9e0ce backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:31 np0005601977 python3.9[79586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:31 np0005601977 python3.9[79709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764050.9449-323-117801641472813/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=5721eecd2cc7d35c10ef9ad72c794b9b2014a973 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:32 np0005601977 python3.9[79861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:33 np0005601977 python3.9[80013]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:33 np0005601977 python3.9[80165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:34 np0005601977 python3.9[80288]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764053.3711889-505-49071796633588/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=17040777f5cd8f85c197b0d92fdce195f483157a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:34 np0005601977 python3.9[80440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:35 np0005601977 python3.9[80563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764054.4509852-505-141798466045918/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ce5169879430fd2d3d983cf42225c6608c73f6d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:35 np0005601977 python3.9[80715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:36 np0005601977 python3.9[80838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764055.4988956-505-240970725112123/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d32f32f9f2939b0d5dd1d696ee1a4df156c83817 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:37 np0005601977 python3.9[80990]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:37 np0005601977 python3.9[81142]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:38 np0005601977 python3.9[81294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:38 np0005601977 python3.9[81417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764057.7338984-672-211161754160101/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bbefd24f9df1b9132ce4cab8f240699a398afe40 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:39 np0005601977 python3.9[81569]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:39 np0005601977 python3.9[81692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764058.7844903-672-261699185201552/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ce5169879430fd2d3d983cf42225c6608c73f6d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:40 np0005601977 python3.9[81844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:41 np0005601977 python3.9[81967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764060.0811896-672-15005196429104/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=52774c0ea03ac1ed82254412aae8283cdca5a8dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:42 np0005601977 python3.9[82119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:43 np0005601977 python3.9[82271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:43 np0005601977 python3.9[82394]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764062.6174424-882-120657915002383/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:44 np0005601977 python3.9[82546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:44 np0005601977 python3.9[82698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:45 np0005601977 python3.9[82821]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764064.5240207-963-121834753516579/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:46 np0005601977 python3.9[82973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:47 np0005601977 python3.9[83125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:47 np0005601977 python3.9[83248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764066.5913858-1039-127498421993830/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:48 np0005601977 python3.9[83400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:48 np0005601977 python3.9[83552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:49 np0005601977 chronyd[65010]: Selected source 158.69.193.108 (pool.ntp.org)
Jan 30 04:07:49 np0005601977 python3.9[83675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764068.2743354-1107-45277536783183/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:49 np0005601977 python3.9[83827]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:50 np0005601977 python3.9[83979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:51 np0005601977 python3.9[84102]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764070.4274104-1183-108184516486645/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:52 np0005601977 python3.9[84254]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:52 np0005601977 python3.9[84406]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:53 np0005601977 python3.9[84529]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764072.2004912-1253-17699067663502/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:53 np0005601977 python3.9[84681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:54 np0005601977 python3.9[84833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:54 np0005601977 python3.9[84956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764073.7772808-1322-140904811046648/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:55 np0005601977 systemd-logind[809]: Session 19 logged out. Waiting for processes to exit.
Jan 30 04:07:55 np0005601977 systemd[1]: session-19.scope: Deactivated successfully.
Jan 30 04:07:55 np0005601977 systemd[1]: session-19.scope: Consumed 24.538s CPU time.
Jan 30 04:07:55 np0005601977 systemd-logind[809]: Removed session 19.
Jan 30 04:08:01 np0005601977 systemd-logind[809]: New session 20 of user zuul.
Jan 30 04:08:01 np0005601977 systemd[1]: Started Session 20 of User zuul.
Jan 30 04:08:02 np0005601977 python3.9[85134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:08:03 np0005601977 python3.9[85290]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:04 np0005601977 python3.9[85442]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:05 np0005601977 python3.9[85592]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:08:06 np0005601977 python3.9[85744]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 30 04:08:08 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 30 04:08:08 np0005601977 python3.9[85900]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:08:09 np0005601977 python3.9[85984]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:08:11 np0005601977 python3.9[86137]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:08:12 np0005601977 python3[86292]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 30 04:08:13 np0005601977 python3.9[86444]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:13 np0005601977 python3.9[86596]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:14 np0005601977 python3.9[86674]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:14 np0005601977 python3.9[86826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:15 np0005601977 python3.9[86904]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ssmhtq3b recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:16 np0005601977 python3.9[87056]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:16 np0005601977 python3.9[87134]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:17 np0005601977 python3.9[87286]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:18 np0005601977 python3[87439]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:08:19 np0005601977 python3.9[87591]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:19 np0005601977 python3.9[87716]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764098.6437693-426-206307635885303/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:20 np0005601977 python3.9[87868]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:20 np0005601977 python3.9[87993]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764099.934523-471-38436926229760/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:21 np0005601977 python3.9[88145]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:22 np0005601977 python3.9[88270]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764101.2442994-516-119837004450672/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:22 np0005601977 python3.9[88422]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:23 np0005601977 python3.9[88547]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764102.4828284-561-58126694777126/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:24 np0005601977 python3.9[88699]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:24 np0005601977 python3.9[88824]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764103.7322335-606-241215559692834/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:25 np0005601977 python3.9[88976]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:26 np0005601977 python3.9[89128]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:26 np0005601977 python3.9[89283]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:27 np0005601977 python3.9[89435]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:28 np0005601977 python3.9[89588]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:28 np0005601977 python3.9[89742]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:29 np0005601977 python3.9[89897]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:30 np0005601977 python3.9[90047]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:08:31 np0005601977 python3.9[90200]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:31 np0005601977 ovs-vsctl[90201]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 30 04:08:32 np0005601977 python3.9[90353]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:33 np0005601977 python3.9[90508]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:33 np0005601977 ovs-vsctl[90509]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 30 04:08:33 np0005601977 python3.9[90659]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:34 np0005601977 python3.9[90813]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:35 np0005601977 python3.9[90965]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:35 np0005601977 python3.9[91043]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:36 np0005601977 python3.9[91195]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:36 np0005601977 python3.9[91273]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:37 np0005601977 python3.9[91425]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:37 np0005601977 python3.9[91577]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:38 np0005601977 python3.9[91655]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:38 np0005601977 python3.9[91807]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:39 np0005601977 python3.9[91885]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:39 np0005601977 python3.9[92037]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:08:39 np0005601977 systemd[1]: Reloading.
Jan 30 04:08:39 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:39 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:40 np0005601977 python3.9[92226]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:41 np0005601977 python3.9[92304]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:41 np0005601977 python3.9[92456]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:42 np0005601977 python3.9[92534]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:43 np0005601977 python3.9[92686]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:08:43 np0005601977 systemd[1]: Reloading.
Jan 30 04:08:43 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:43 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:43 np0005601977 systemd[1]: Starting Create netns directory...
Jan 30 04:08:43 np0005601977 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 30 04:08:43 np0005601977 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 30 04:08:43 np0005601977 systemd[1]: Finished Create netns directory.
Jan 30 04:08:44 np0005601977 python3.9[92879]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:44 np0005601977 python3.9[93031]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:45 np0005601977 python3.9[93154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764124.3687289-1359-173944296082529/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:46 np0005601977 python3.9[93306]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:47 np0005601977 python3.9[93458]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:47 np0005601977 python3.9[93610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:48 np0005601977 python3.9[93733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764127.4505935-1458-17559063076417/.source.json _original_basename=.f0_wxvmj follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:48 np0005601977 python3.9[93883]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:51 np0005601977 python3.9[94306]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 30 04:08:52 np0005601977 python3.9[94458]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:08:53 np0005601977 python3[94611]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:08:53 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:08:53 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:08:53 np0005601977 podman[94647]: 2026-01-30 09:08:53.830672541 +0000 UTC m=+0.049334239 container create 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:08:53 np0005601977 podman[94647]: 2026-01-30 09:08:53.799422409 +0000 UTC m=+0.018084087 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 30 04:08:53 np0005601977 python3[94611]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 30 04:08:54 np0005601977 python3.9[94836]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:54 np0005601977 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:08:55 np0005601977 python3.9[94990]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:55 np0005601977 python3.9[95066]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:56 np0005601977 python3.9[95217]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764135.714845-1692-35142114089322/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:56 np0005601977 python3.9[95293]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:08:56 np0005601977 systemd[1]: Reloading.
Jan 30 04:08:57 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:57 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:57 np0005601977 python3.9[95404]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:08:57 np0005601977 systemd[1]: Reloading.
Jan 30 04:08:57 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:57 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:57 np0005601977 systemd[1]: Starting ovn_controller container...
Jan 30 04:08:58 np0005601977 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 30 04:08:58 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:08:58 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17e49936f334310b6690e67057deb7ad6582ed166397e032d01b89c3defd2f94/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 30 04:08:58 np0005601977 systemd[1]: Started /usr/bin/podman healthcheck run 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851.
Jan 30 04:08:58 np0005601977 podman[95445]: 2026-01-30 09:08:58.077124633 +0000 UTC m=+0.140540163 container init 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + sudo -E kolla_set_configs
Jan 30 04:08:58 np0005601977 podman[95445]: 2026-01-30 09:08:58.102800246 +0000 UTC m=+0.166215756 container start 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 30 04:08:58 np0005601977 edpm-start-podman-container[95445]: ovn_controller
Jan 30 04:08:58 np0005601977 systemd[1]: Created slice User Slice of UID 0.
Jan 30 04:08:58 np0005601977 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 30 04:08:58 np0005601977 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 30 04:08:58 np0005601977 systemd[1]: Starting User Manager for UID 0...
Jan 30 04:08:58 np0005601977 edpm-start-podman-container[95444]: Creating additional drop-in dependency for "ovn_controller" (92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851)
Jan 30 04:08:58 np0005601977 podman[95467]: 2026-01-30 09:08:58.165243419 +0000 UTC m=+0.055061993 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:08:58 np0005601977 systemd[1]: 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851-357f9c48ab8a94b6.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:08:58 np0005601977 systemd[1]: 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851-357f9c48ab8a94b6.service: Failed with result 'exit-code'.
Jan 30 04:08:58 np0005601977 systemd[1]: Reloading.
Jan 30 04:08:58 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:58 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:58 np0005601977 systemd[95497]: Queued start job for default target Main User Target.
Jan 30 04:08:58 np0005601977 systemd[95497]: Created slice User Application Slice.
Jan 30 04:08:58 np0005601977 systemd[95497]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 30 04:08:58 np0005601977 systemd[95497]: Started Daily Cleanup of User's Temporary Directories.
Jan 30 04:08:58 np0005601977 systemd[95497]: Reached target Paths.
Jan 30 04:08:58 np0005601977 systemd[95497]: Reached target Timers.
Jan 30 04:08:58 np0005601977 systemd[95497]: Starting D-Bus User Message Bus Socket...
Jan 30 04:08:58 np0005601977 systemd[95497]: Starting Create User's Volatile Files and Directories...
Jan 30 04:08:58 np0005601977 systemd[95497]: Finished Create User's Volatile Files and Directories.
Jan 30 04:08:58 np0005601977 systemd[95497]: Listening on D-Bus User Message Bus Socket.
Jan 30 04:08:58 np0005601977 systemd[95497]: Reached target Sockets.
Jan 30 04:08:58 np0005601977 systemd[95497]: Reached target Basic System.
Jan 30 04:08:58 np0005601977 systemd[95497]: Reached target Main User Target.
Jan 30 04:08:58 np0005601977 systemd[95497]: Startup finished in 109ms.
Jan 30 04:08:58 np0005601977 systemd[1]: Started User Manager for UID 0.
Jan 30 04:08:58 np0005601977 systemd[1]: Started ovn_controller container.
Jan 30 04:08:58 np0005601977 systemd[1]: Started Session c1 of User root.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: INFO:__main__:Validating config file
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: INFO:__main__:Writing out command to execute
Jan 30 04:08:58 np0005601977 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: ++ cat /run_command
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + ARGS=
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + sudo kolla_copy_cacerts
Jan 30 04:08:58 np0005601977 systemd[1]: Started Session c2 of User root.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + [[ ! -n '' ]]
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + . kolla_extend_start
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + umask 0022
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 30 04:08:58 np0005601977 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5148] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5156] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <warn>  [1769764138.5159] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5173] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5184] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 30 04:08:58 np0005601977 kernel: br-int: entered promiscuous mode
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5200] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 30 04:08:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:08:58Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5343] manager: (ovn-17e7b0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 30 04:08:58 np0005601977 systemd-udevd[95593]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:08:58 np0005601977 kernel: genev_sys_6081: entered promiscuous mode
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5486] device (genev_sys_6081): carrier: link connected
Jan 30 04:08:58 np0005601977 NetworkManager[55565]: <info>  [1769764138.5491] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 30 04:08:58 np0005601977 systemd-udevd[95596]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:08:59 np0005601977 NetworkManager[55565]: <info>  [1769764139.1339] manager: (ovn-d14b9a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 30 04:08:59 np0005601977 python3.9[95723]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:09:01 np0005601977 python3.9[95875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:01 np0005601977 python3.9[95998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764140.5619748-1827-43002071092515/.source.yaml _original_basename=.j5r7x5r2 follow=False checksum=ec333544c79641cd730121880e32bc9e0db5fd7e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:02 np0005601977 python3.9[96150]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:09:02 np0005601977 ovs-vsctl[96151]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 30 04:09:03 np0005601977 python3.9[96303]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:09:03 np0005601977 ovs-vsctl[96305]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 30 04:09:04 np0005601977 python3.9[96458]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:09:04 np0005601977 ovs-vsctl[96459]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 30 04:09:04 np0005601977 systemd[1]: session-20.scope: Deactivated successfully.
Jan 30 04:09:04 np0005601977 systemd[1]: session-20.scope: Consumed 41.229s CPU time.
Jan 30 04:09:04 np0005601977 systemd-logind[809]: Session 20 logged out. Waiting for processes to exit.
Jan 30 04:09:04 np0005601977 systemd-logind[809]: Removed session 20.
Jan 30 04:09:08 np0005601977 systemd[1]: Stopping User Manager for UID 0...
Jan 30 04:09:08 np0005601977 systemd[95497]: Activating special unit Exit the Session...
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped target Main User Target.
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped target Basic System.
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped target Paths.
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped target Sockets.
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped target Timers.
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 30 04:09:08 np0005601977 systemd[95497]: Closed D-Bus User Message Bus Socket.
Jan 30 04:09:08 np0005601977 systemd[95497]: Stopped Create User's Volatile Files and Directories.
Jan 30 04:09:08 np0005601977 systemd[95497]: Removed slice User Application Slice.
Jan 30 04:09:08 np0005601977 systemd[95497]: Reached target Shutdown.
Jan 30 04:09:08 np0005601977 systemd[95497]: Finished Exit the Session.
Jan 30 04:09:08 np0005601977 systemd[95497]: Reached target Exit the Session.
Jan 30 04:09:08 np0005601977 systemd[1]: user@0.service: Deactivated successfully.
Jan 30 04:09:08 np0005601977 systemd[1]: Stopped User Manager for UID 0.
Jan 30 04:09:08 np0005601977 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 30 04:09:08 np0005601977 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 30 04:09:08 np0005601977 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 30 04:09:08 np0005601977 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 30 04:09:08 np0005601977 systemd[1]: Removed slice User Slice of UID 0.
Jan 30 04:09:10 np0005601977 systemd-logind[809]: New session 22 of user zuul.
Jan 30 04:09:10 np0005601977 systemd[1]: Started Session 22 of User zuul.
Jan 30 04:09:11 np0005601977 python3.9[96638]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:09:12 np0005601977 python3.9[96794]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:13 np0005601977 python3.9[96946]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:13 np0005601977 python3.9[97098]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:14 np0005601977 python3.9[97250]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:15 np0005601977 python3.9[97402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:15 np0005601977 python3.9[97552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:09:16 np0005601977 python3.9[97705]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 30 04:09:18 np0005601977 python3.9[97855]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:19 np0005601977 python3.9[97976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764157.8856814-213-247667736897870/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:19 np0005601977 python3.9[98126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:20 np0005601977 python3.9[98247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764159.304139-258-151697145453450/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:21 np0005601977 python3.9[98399]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:09:22 np0005601977 python3.9[98483]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:09:24 np0005601977 python3.9[98636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:09:25 np0005601977 python3.9[98789]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:26 np0005601977 python3.9[98910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764165.2262068-369-209766678384265/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:26 np0005601977 python3.9[99060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:27 np0005601977 python3.9[99181]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764166.277351-369-74772596033632/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:28 np0005601977 ovn_controller[95460]: 2026-01-30T09:09:28Z|00025|memory|INFO|16256 kB peak resident set size after 29.9 seconds
Jan 30 04:09:28 np0005601977 ovn_controller[95460]: 2026-01-30T09:09:28Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 30 04:09:28 np0005601977 podman[99305]: 2026-01-30 09:09:28.404443985 +0000 UTC m=+0.106414929 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:09:28 np0005601977 python3.9[99341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:28 np0005601977 python3.9[99477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764168.0945492-501-40954708404444/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:29 np0005601977 python3.9[99627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:30 np0005601977 python3.9[99748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764169.35015-501-83334742440182/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:30 np0005601977 python3.9[99898]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:09:31 np0005601977 python3.9[100052]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:32 np0005601977 python3.9[100204]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:32 np0005601977 python3.9[100282]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:33 np0005601977 python3.9[100434]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:33 np0005601977 python3.9[100512]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:34 np0005601977 python3.9[100664]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:35 np0005601977 python3.9[100816]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:35 np0005601977 python3.9[100894]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:36 np0005601977 python3.9[101046]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:36 np0005601977 python3.9[101124]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:37 np0005601977 python3.9[101276]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:09:37 np0005601977 systemd[1]: Reloading.
Jan 30 04:09:37 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:37 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:38 np0005601977 python3.9[101465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:39 np0005601977 python3.9[101543]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:39 np0005601977 python3.9[101695]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:40 np0005601977 python3.9[101773]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:40 np0005601977 python3.9[101925]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:09:40 np0005601977 systemd[1]: Reloading.
Jan 30 04:09:40 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:40 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:41 np0005601977 systemd[1]: Starting Create netns directory...
Jan 30 04:09:41 np0005601977 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 30 04:09:41 np0005601977 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 30 04:09:41 np0005601977 systemd[1]: Finished Create netns directory.
Jan 30 04:09:42 np0005601977 python3.9[102118]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:42 np0005601977 python3.9[102270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:43 np0005601977 python3.9[102393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764182.2224584-954-65588661059382/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:44 np0005601977 python3.9[102545]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:44 np0005601977 python3.9[102697]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:45 np0005601977 python3.9[102849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:46 np0005601977 python3.9[102972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764185.1547992-1053-245549765365736/.source.json _original_basename=.kiq883s9 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:46 np0005601977 python3.9[103122]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:48 np0005601977 python3.9[103545]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 30 04:09:49 np0005601977 python3.9[103697]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:09:51 np0005601977 python3[103849]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:09:51 np0005601977 podman[103886]: 2026-01-30 09:09:51.341588105 +0000 UTC m=+0.065585693 container create 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:09:51 np0005601977 podman[103886]: 2026-01-30 09:09:51.309045466 +0000 UTC m=+0.033043074 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:09:51 np0005601977 python3[103849]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:09:52 np0005601977 python3.9[104077]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:09:52 np0005601977 python3.9[104231]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:53 np0005601977 python3.9[104307]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:09:53 np0005601977 python3.9[104458]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764193.3224838-1287-106105188634243/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:54 np0005601977 python3.9[104534]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:09:54 np0005601977 systemd[1]: Reloading.
Jan 30 04:09:54 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:54 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:55 np0005601977 python3.9[104645]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:09:55 np0005601977 systemd[1]: Reloading.
Jan 30 04:09:55 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:55 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:55 np0005601977 systemd[1]: Starting ovn_metadata_agent container...
Jan 30 04:09:55 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:09:55 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1789c5946d6844456cc5cfb0048c4eeeb8ab6ae09749a60fc676ef95997fde0f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 30 04:09:55 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1789c5946d6844456cc5cfb0048c4eeeb8ab6ae09749a60fc676ef95997fde0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:09:55 np0005601977 systemd[1]: Started /usr/bin/podman healthcheck run 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d.
Jan 30 04:09:55 np0005601977 podman[104686]: 2026-01-30 09:09:55.674784423 +0000 UTC m=+0.111082402 container init 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + sudo -E kolla_set_configs
Jan 30 04:09:55 np0005601977 podman[104686]: 2026-01-30 09:09:55.701185677 +0000 UTC m=+0.137483656 container start 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:09:55 np0005601977 edpm-start-podman-container[104686]: ovn_metadata_agent
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Validating config file
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Copying service configuration files
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Writing out command to execute
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 30 04:09:55 np0005601977 edpm-start-podman-container[104685]: Creating additional drop-in dependency for "ovn_metadata_agent" (9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d)
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: ++ cat /run_command
Jan 30 04:09:55 np0005601977 podman[104707]: 2026-01-30 09:09:55.759303926 +0000 UTC m=+0.048282069 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + CMD=neutron-ovn-metadata-agent
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + ARGS=
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + sudo kolla_copy_cacerts
Jan 30 04:09:55 np0005601977 systemd[1]: Reloading.
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + [[ ! -n '' ]]
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + . kolla_extend_start
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: Running command: 'neutron-ovn-metadata-agent'
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + umask 0022
Jan 30 04:09:55 np0005601977 ovn_metadata_agent[104701]: + exec neutron-ovn-metadata-agent
Jan 30 04:09:55 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:55 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:55 np0005601977 systemd[1]: Started ovn_metadata_agent container.
Jan 30 04:09:57 np0005601977 python3.9[104941]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.324 104706 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.324 104706 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.324 104706 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.325 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.325 104706 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.325 104706 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.325 104706 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.325 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.325 104706 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.326 104706 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.327 104706 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.328 104706 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.329 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.330 104706 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.331 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.332 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.333 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.334 104706 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.335 104706 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.336 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.337 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.338 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.339 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.340 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.341 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.342 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.343 104706 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.344 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.345 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.346 104706 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.347 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.348 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.349 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.350 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.351 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.352 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.353 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.354 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.355 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.356 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.357 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.358 104706 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.367 104706 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.367 104706 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.367 104706 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.368 104706 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.368 104706 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.379 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 9be64184-856f-4986-a80e-9403fa35a6a5 (UUID: 9be64184-856f-4986-a80e-9403fa35a6a5) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.400 104706 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.400 104706 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.400 104706 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.400 104706 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.404 104706 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.410 104706 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.416 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '9be64184-856f-4986-a80e-9403fa35a6a5'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], external_ids={}, name=9be64184-856f-4986-a80e-9403fa35a6a5, nb_cfg_timestamp=1769764146537, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.417 104706 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f4026cedb50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.418 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.418 104706 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.418 104706 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.419 104706 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.422 104706 DEBUG oslo_service.service [-] Started child 104966 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.424 104706 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpdgql670n/privsep.sock']#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.426 104966 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-242333'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.459 104966 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.460 104966 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.460 104966 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.467 104966 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.480 104966 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:09:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.489 104966 INFO eventlet.wsgi.server [-] (104966) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 30 04:09:57 np0005601977 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:58.093 104706 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:58.094 104706 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpdgql670n/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.964 105085 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.967 105085 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.968 105085 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:57.969 105085 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105085#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:58.096 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[b091caea-15f4-4e9b-ae83-7b08cbaec68d]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:09:58 np0005601977 python3.9[105099]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:58.533 105085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:58.534 105085 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:09:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:58.534 105085 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:09:58 np0005601977 podman[105200]: 2026-01-30 09:09:58.678817719 +0000 UTC m=+0.097014342 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:09:58 np0005601977 python3.9[105236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764197.7549756-1422-122880728412272/.source.yaml _original_basename=.htmzqoz1 follow=False checksum=3b6fe052ce520a89275a36a1ba4ff1848cf43bed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.003 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fafb83-e820-4669-ad80-377fb515f676]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.004 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, column=external_ids, values=({'neutron:ovn-metadata-id': 'fcc79ef3-e886-535e-8727-c194a86eed0f'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.017 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.024 104706 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.025 104706 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.025 104706 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.025 104706 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.025 104706 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.025 104706 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.026 104706 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.026 104706 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.026 104706 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.026 104706 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.027 104706 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.027 104706 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.027 104706 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.027 104706 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.028 104706 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.028 104706 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.028 104706 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.029 104706 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.029 104706 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.029 104706 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.029 104706 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.029 104706 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.030 104706 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.030 104706 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.030 104706 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.031 104706 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.031 104706 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.032 104706 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.032 104706 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.032 104706 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.033 104706 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.033 104706 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.033 104706 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.033 104706 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.034 104706 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.034 104706 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.034 104706 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.034 104706 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.035 104706 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.035 104706 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.035 104706 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.036 104706 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.036 104706 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.036 104706 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.036 104706 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.037 104706 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.037 104706 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.037 104706 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.037 104706 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.038 104706 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.038 104706 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.038 104706 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.038 104706 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.038 104706 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.039 104706 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.039 104706 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.039 104706 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.039 104706 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.039 104706 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.040 104706 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.040 104706 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.040 104706 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.040 104706 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.040 104706 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.041 104706 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.041 104706 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.041 104706 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.041 104706 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.042 104706 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.042 104706 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.042 104706 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.042 104706 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.042 104706 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.043 104706 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.043 104706 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.043 104706 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.043 104706 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.044 104706 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.044 104706 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.044 104706 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.044 104706 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.044 104706 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.045 104706 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.045 104706 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.045 104706 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.045 104706 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.045 104706 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.046 104706 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.046 104706 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.046 104706 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.047 104706 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.047 104706 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.047 104706 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.048 104706 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.048 104706 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.048 104706 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.049 104706 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.049 104706 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.049 104706 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.049 104706 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.050 104706 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.050 104706 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.050 104706 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.050 104706 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.051 104706 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.051 104706 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.051 104706 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.051 104706 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.052 104706 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.052 104706 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.052 104706 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.052 104706 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.052 104706 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.053 104706 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.053 104706 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.053 104706 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.053 104706 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.054 104706 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.054 104706 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.054 104706 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.054 104706 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.054 104706 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.055 104706 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.055 104706 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.055 104706 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.055 104706 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.056 104706 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.056 104706 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.056 104706 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.056 104706 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.056 104706 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.057 104706 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.057 104706 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.057 104706 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.057 104706 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.057 104706 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.058 104706 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.058 104706 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.058 104706 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.058 104706 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.059 104706 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.059 104706 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.059 104706 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.059 104706 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.059 104706 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.060 104706 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.060 104706 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.060 104706 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.060 104706 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.060 104706 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.061 104706 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.061 104706 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.061 104706 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.061 104706 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.061 104706 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.062 104706 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.062 104706 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.062 104706 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.062 104706 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.062 104706 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.063 104706 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.063 104706 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.063 104706 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.063 104706 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.064 104706 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.064 104706 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.064 104706 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.064 104706 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.064 104706 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.064 104706 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.065 104706 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.066 104706 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.066 104706 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.066 104706 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.066 104706 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.066 104706 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.066 104706 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.067 104706 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.068 104706 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.068 104706 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.068 104706 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.068 104706 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.068 104706 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.068 104706 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.069 104706 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.070 104706 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.071 104706 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.071 104706 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.071 104706 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.071 104706 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.071 104706 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.071 104706 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.072 104706 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.072 104706 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.072 104706 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.072 104706 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.072 104706 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.072 104706 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.073 104706 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.074 104706 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.075 104706 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.075 104706 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.075 104706 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.075 104706 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.075 104706 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.075 104706 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.076 104706 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.076 104706 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.076 104706 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.076 104706 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.076 104706 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.077 104706 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.077 104706 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.077 104706 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.077 104706 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.077 104706 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.077 104706 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.078 104706 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.078 104706 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.078 104706 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.078 104706 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.078 104706 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.078 104706 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.079 104706 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.079 104706 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.079 104706 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.079 104706 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.080 104706 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.080 104706 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.080 104706 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.080 104706 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.080 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.081 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.081 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.081 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.081 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.081 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.082 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.082 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.082 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.082 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.083 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.083 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.083 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.083 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.083 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.084 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.085 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.085 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.085 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.085 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.085 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.085 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.086 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.086 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.086 104706 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.086 104706 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.086 104706 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.087 104706 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.087 104706 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:09:59.087 104706 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 30 04:09:59 np0005601977 systemd[1]: session-22.scope: Deactivated successfully.
Jan 30 04:09:59 np0005601977 systemd[1]: session-22.scope: Consumed 32.486s CPU time.
Jan 30 04:09:59 np0005601977 systemd-logind[809]: Session 22 logged out. Waiting for processes to exit.
Jan 30 04:09:59 np0005601977 systemd-logind[809]: Removed session 22.
Jan 30 04:10:04 np0005601977 systemd-logind[809]: New session 23 of user zuul.
Jan 30 04:10:04 np0005601977 systemd[1]: Started Session 23 of User zuul.
Jan 30 04:10:05 np0005601977 python3.9[105431]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:10:06 np0005601977 python3.9[105587]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:08 np0005601977 python3.9[105752]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:10:08 np0005601977 systemd[1]: Reloading.
Jan 30 04:10:08 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:10:08 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:10:09 np0005601977 python3.9[105937]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:10:09 np0005601977 network[105954]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:10:09 np0005601977 network[105955]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:10:09 np0005601977 network[105956]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:10:14 np0005601977 python3.9[106217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:15 np0005601977 python3.9[106370]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:16 np0005601977 python3.9[106523]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:16 np0005601977 python3.9[106676]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:17 np0005601977 python3.9[106829]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:18 np0005601977 python3.9[106982]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:18 np0005601977 python3.9[107135]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:19 np0005601977 python3.9[107288]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:20 np0005601977 python3.9[107440]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:21 np0005601977 python3.9[107592]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:21 np0005601977 python3.9[107744]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:22 np0005601977 python3.9[107896]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:22 np0005601977 python3.9[108048]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:23 np0005601977 python3.9[108200]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:24 np0005601977 python3.9[108352]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:24 np0005601977 python3.9[108504]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:25 np0005601977 python3.9[108656]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:25 np0005601977 python3.9[108808]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:26 np0005601977 podman[108932]: 2026-01-30 09:10:26.029947365 +0000 UTC m=+0.067468306 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:10:26 np0005601977 python3.9[108971]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:26 np0005601977 python3.9[109132]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:27 np0005601977 python3.9[109284]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:28 np0005601977 python3.9[109436]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:28 np0005601977 podman[109515]: 2026-01-30 09:10:28.899143689 +0000 UTC m=+0.108320854 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:10:29 np0005601977 python3.9[109615]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:10:30 np0005601977 python3.9[109767]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:10:30 np0005601977 systemd[1]: Reloading.
Jan 30 04:10:30 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:10:30 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:10:31 np0005601977 python3.9[109954]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:31 np0005601977 python3.9[110107]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:32 np0005601977 python3.9[110260]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:32 np0005601977 python3.9[110413]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:33 np0005601977 python3.9[110566]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:33 np0005601977 python3.9[110719]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:34 np0005601977 python3.9[110872]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:35 np0005601977 python3.9[111025]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 30 04:10:36 np0005601977 python3.9[111178]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:10:37 np0005601977 python3.9[111336]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:10:38 np0005601977 python3.9[111496]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:10:39 np0005601977 python3.9[111580]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:10:56 np0005601977 podman[111771]: 2026-01-30 09:10:56.889362343 +0000 UTC m=+0.067360222 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:10:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:10:57.360 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:10:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:10:57.362 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:10:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:10:57.362 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:10:59 np0005601977 podman[111788]: 2026-01-30 09:10:59.892441017 +0000 UTC m=+0.106753541 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:11:01 np0005601977 kernel: SELinux:  Converting 2768 SID table entries...
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:11:01 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:11:10 np0005601977 kernel: SELinux:  Converting 2768 SID table entries...
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:11:10 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:11:27 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 30 04:11:27 np0005601977 podman[117629]: 2026-01-30 09:11:27.855472232 +0000 UTC m=+0.065089677 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:11:30 np0005601977 podman[120499]: 2026-01-30 09:11:30.842976374 +0000 UTC m=+0.066101125 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:11:52 np0005601977 kernel: SELinux:  Converting 2769 SID table entries...
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:11:52 np0005601977 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:11:54 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 04:11:54 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 30 04:11:54 np0005601977 dbus-broker-launch[783]: Noticed file-system modification, trigger reload.
Jan 30 04:11:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:11:57.363 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:11:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:11:57.364 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:11:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:11:57.364 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:11:57 np0005601977 podman[128896]: 2026-01-30 09:11:57.97011887 +0000 UTC m=+0.077749068 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 30 04:12:01 np0005601977 systemd[1]: Stopping OpenSSH server daemon...
Jan 30 04:12:01 np0005601977 systemd[1]: sshd.service: Deactivated successfully.
Jan 30 04:12:01 np0005601977 systemd[1]: Stopped OpenSSH server daemon.
Jan 30 04:12:01 np0005601977 systemd[1]: sshd.service: Consumed 1.251s CPU time, read 32.0K from disk, written 0B to disk.
Jan 30 04:12:01 np0005601977 systemd[1]: Stopped target sshd-keygen.target.
Jan 30 04:12:01 np0005601977 systemd[1]: Stopping sshd-keygen.target...
Jan 30 04:12:01 np0005601977 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 04:12:01 np0005601977 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 04:12:01 np0005601977 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 04:12:01 np0005601977 systemd[1]: Reached target sshd-keygen.target.
Jan 30 04:12:01 np0005601977 systemd[1]: Starting OpenSSH server daemon...
Jan 30 04:12:01 np0005601977 systemd[1]: Started OpenSSH server daemon.
Jan 30 04:12:01 np0005601977 podman[129531]: 2026-01-30 09:12:01.545891609 +0000 UTC m=+0.124613433 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:12:02 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:12:02 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:12:03 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:03 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:03 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:03 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:12:09 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:12:09 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:12:09 np0005601977 systemd[1]: man-db-cache-update.service: Consumed 7.355s CPU time.
Jan 30 04:12:09 np0005601977 systemd[1]: run-r1665f35ac153420db7591f3d64b2adcf.service: Deactivated successfully.
Jan 30 04:12:24 np0005601977 python3.9[138345]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:24 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:24 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:24 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:26 np0005601977 python3.9[138535]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:26 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:26 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:26 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:26 np0005601977 python3.9[138725]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:27 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:27 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:27 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:27 np0005601977 python3.9[138915]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:28 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:28 np0005601977 podman[138917]: 2026-01-30 09:12:28.088193502 +0000 UTC m=+0.059199583 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:12:28 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:28 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:30 np0005601977 python3.9[139124]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:30 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:30 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:30 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:31 np0005601977 python3.9[139314]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:31 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:31 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:31 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:31 np0005601977 podman[139430]: 2026-01-30 09:12:31.876879215 +0000 UTC m=+0.097909859 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:12:32 np0005601977 python3.9[139531]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:32 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:32 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:32 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:33 np0005601977 python3.9[139721]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:34 np0005601977 python3.9[139876]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:34 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:34 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:34 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:35 np0005601977 python3.9[140069]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:35 np0005601977 systemd[1]: Reloading.
Jan 30 04:12:35 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:35 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:35 np0005601977 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 30 04:12:35 np0005601977 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 30 04:12:36 np0005601977 python3.9[140261]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:37 np0005601977 python3.9[140416]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:38 np0005601977 python3.9[140571]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:38 np0005601977 python3.9[140726]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:39 np0005601977 python3.9[140881]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:40 np0005601977 python3.9[141036]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:41 np0005601977 python3.9[141191]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:42 np0005601977 python3.9[141346]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:42 np0005601977 python3.9[141501]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:43 np0005601977 python3.9[141656]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:44 np0005601977 python3.9[141811]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:45 np0005601977 python3.9[141966]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:46 np0005601977 python3.9[142121]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:46 np0005601977 python3.9[142276]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:49 np0005601977 python3.9[142431]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:50 np0005601977 python3.9[142583]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:51 np0005601977 python3.9[142735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:51 np0005601977 python3.9[142887]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:52 np0005601977 python3.9[143039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:52 np0005601977 python3.9[143191]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:53 np0005601977 python3.9[143341]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:12:54 np0005601977 python3.9[143493]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:55 np0005601977 python3.9[143618]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764374.0804026-1641-270502668903608/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:56 np0005601977 python3.9[143770]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:56 np0005601977 python3.9[143895]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764375.7407775-1641-173969725147229/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:57 np0005601977 python3.9[144047]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:12:57.363 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:12:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:12:57.364 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:12:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:12:57.364 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:12:57 np0005601977 python3.9[144172]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764376.9468372-1641-247507864647945/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:58 np0005601977 podman[144296]: 2026-01-30 09:12:58.342376735 +0000 UTC m=+0.076420666 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:12:58 np0005601977 python3.9[144341]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:58 np0005601977 python3.9[144469]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764378.0337908-1641-85503628492921/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:59 np0005601977 python3.9[144621]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:00 np0005601977 python3.9[144746]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764379.373032-1641-228135286762761/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:00 np0005601977 python3.9[144898]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:01 np0005601977 python3.9[145023]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764380.4018962-1641-3803696405207/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:01 np0005601977 python3.9[145175]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:02 np0005601977 podman[145270]: 2026-01-30 09:13:02.246340125 +0000 UTC m=+0.078506115 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:13:02 np0005601977 python3.9[145313]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764381.4595726-1641-89692945025204/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:02 np0005601977 python3.9[145476]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:03 np0005601977 python3.9[145601]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764382.4859571-1641-120454336911845/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:04 np0005601977 python3.9[145753]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 30 04:13:05 np0005601977 python3.9[145906]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:06 np0005601977 python3.9[146058]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:06 np0005601977 python3.9[146210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:07 np0005601977 python3.9[146362]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:07 np0005601977 python3.9[146514]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:08 np0005601977 python3.9[146666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:09 np0005601977 python3.9[146818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:09 np0005601977 python3.9[146970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:10 np0005601977 python3.9[147122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:10 np0005601977 python3.9[147274]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:11 np0005601977 python3.9[147426]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:11 np0005601977 python3.9[147578]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:12 np0005601977 python3.9[147730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:12 np0005601977 python3.9[147882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:14 np0005601977 python3.9[148034]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:15 np0005601977 python3.9[148157]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764393.8736134-2304-232383946226069/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:15 np0005601977 python3.9[148309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:16 np0005601977 python3.9[148432]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764395.227129-2304-114362394051336/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:16 np0005601977 python3.9[148584]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:17 np0005601977 python3.9[148707]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764396.323089-2304-140976476207065/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:18 np0005601977 python3.9[148859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:18 np0005601977 python3.9[148982]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764397.5675423-2304-71601897092971/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:19 np0005601977 python3.9[149136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:19 np0005601977 python3.9[149259]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764398.6814766-2304-58433486943940/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:20 np0005601977 python3.9[149411]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:21 np0005601977 python3.9[149534]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764400.124065-2304-15420940755996/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:21 np0005601977 python3.9[149686]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:22 np0005601977 python3.9[149809]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764401.330598-2304-264033102143048/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:22 np0005601977 python3.9[149961]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:23 np0005601977 python3.9[150084]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764402.4764545-2304-2730640292673/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:23 np0005601977 python3.9[150236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:24 np0005601977 python3.9[150359]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764403.5094273-2304-255911603810624/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:25 np0005601977 python3.9[150511]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:25 np0005601977 python3.9[150634]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764404.5588157-2304-28420770780251/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:26 np0005601977 python3.9[150786]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:26 np0005601977 python3.9[150909]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764405.8294232-2304-79523565788152/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:27 np0005601977 python3.9[151061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:27 np0005601977 python3.9[151184]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764406.9644282-2304-279826973634514/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:28 np0005601977 python3.9[151336]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:28 np0005601977 podman[151431]: 2026-01-30 09:13:28.814980538 +0000 UTC m=+0.061353530 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:13:28 np0005601977 python3.9[151478]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764407.9968348-2304-280059883417155/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:29 np0005601977 python3.9[151631]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:30 np0005601977 python3.9[151754]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764409.1156402-2304-120648565079353/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:31 np0005601977 python3.9[151904]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:13:32 np0005601977 podman[152031]: 2026-01-30 09:13:32.482881148 +0000 UTC m=+0.062740699 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 30 04:13:32 np0005601977 python3.9[152083]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 30 04:13:34 np0005601977 dbus-broker-launch[790]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 30 04:13:34 np0005601977 python3.9[152240]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:35 np0005601977 python3.9[152392]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:35 np0005601977 python3.9[152544]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:36 np0005601977 python3.9[152696]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:37 np0005601977 python3.9[152848]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:38 np0005601977 python3.9[153000]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:38 np0005601977 python3.9[153152]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:39 np0005601977 python3.9[153304]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:40 np0005601977 python3.9[153456]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:40 np0005601977 python3.9[153608]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:42 np0005601977 python3.9[153760]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:42 np0005601977 systemd[1]: Reloading.
Jan 30 04:13:42 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:42 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:42 np0005601977 systemd[1]: Starting libvirt logging daemon socket...
Jan 30 04:13:42 np0005601977 systemd[1]: Listening on libvirt logging daemon socket.
Jan 30 04:13:42 np0005601977 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 30 04:13:42 np0005601977 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 30 04:13:42 np0005601977 systemd[1]: Starting libvirt logging daemon...
Jan 30 04:13:42 np0005601977 systemd[1]: Started libvirt logging daemon.
Jan 30 04:13:43 np0005601977 python3.9[153953]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:43 np0005601977 systemd[1]: Reloading.
Jan 30 04:13:43 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:43 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:43 np0005601977 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 30 04:13:43 np0005601977 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 30 04:13:43 np0005601977 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 30 04:13:43 np0005601977 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 30 04:13:43 np0005601977 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 30 04:13:43 np0005601977 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 30 04:13:43 np0005601977 systemd[1]: Starting libvirt nodedev daemon...
Jan 30 04:13:43 np0005601977 systemd[1]: Started libvirt nodedev daemon.
Jan 30 04:13:44 np0005601977 python3.9[154169]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:44 np0005601977 systemd[1]: Reloading.
Jan 30 04:13:44 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:44 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:44 np0005601977 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 30 04:13:44 np0005601977 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 30 04:13:44 np0005601977 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 30 04:13:44 np0005601977 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 30 04:13:44 np0005601977 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 30 04:13:44 np0005601977 systemd[1]: Starting libvirt proxy daemon...
Jan 30 04:13:44 np0005601977 systemd[1]: Started libvirt proxy daemon.
Jan 30 04:13:44 np0005601977 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 30 04:13:44 np0005601977 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 30 04:13:44 np0005601977 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 30 04:13:45 np0005601977 python3.9[154384]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:45 np0005601977 systemd[1]: Reloading.
Jan 30 04:13:45 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:45 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:45 np0005601977 systemd[1]: Listening on libvirt locking daemon socket.
Jan 30 04:13:45 np0005601977 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 30 04:13:45 np0005601977 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 30 04:13:45 np0005601977 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 30 04:13:45 np0005601977 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 30 04:13:45 np0005601977 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 30 04:13:45 np0005601977 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 30 04:13:45 np0005601977 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 30 04:13:45 np0005601977 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 30 04:13:45 np0005601977 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 30 04:13:45 np0005601977 systemd[1]: Starting libvirt QEMU daemon...
Jan 30 04:13:45 np0005601977 systemd[1]: Started libvirt QEMU daemon.
Jan 30 04:13:45 np0005601977 setroubleshoot[154207]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e384a0bc-a2d9-4ade-a2a9-42c0d4537c76
Jan 30 04:13:45 np0005601977 setroubleshoot[154207]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 30 04:13:45 np0005601977 setroubleshoot[154207]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e384a0bc-a2d9-4ade-a2a9-42c0d4537c76
Jan 30 04:13:45 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:13:45 np0005601977 setroubleshoot[154207]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 30 04:13:46 np0005601977 python3.9[154607]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:46 np0005601977 systemd[1]: Reloading.
Jan 30 04:13:46 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:46 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:46 np0005601977 systemd[1]: Starting libvirt secret daemon socket...
Jan 30 04:13:46 np0005601977 systemd[1]: Listening on libvirt secret daemon socket.
Jan 30 04:13:46 np0005601977 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 30 04:13:46 np0005601977 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 30 04:13:46 np0005601977 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 30 04:13:46 np0005601977 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 30 04:13:46 np0005601977 systemd[1]: Starting libvirt secret daemon...
Jan 30 04:13:46 np0005601977 systemd[1]: Started libvirt secret daemon.
Jan 30 04:13:47 np0005601977 python3.9[154820]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:48 np0005601977 python3.9[154972]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:13:49 np0005601977 python3.9[155124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:50 np0005601977 python3.9[155247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764429.4661722-3339-180346202876438/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:51 np0005601977 python3.9[155399]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:51 np0005601977 python3.9[155551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:52 np0005601977 python3.9[155629]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:52 np0005601977 python3.9[155781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:53 np0005601977 python3.9[155859]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.m94mj1tb recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:54 np0005601977 python3.9[156011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:54 np0005601977 python3.9[156089]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:55 np0005601977 python3.9[156241]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:13:55 np0005601977 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 30 04:13:55 np0005601977 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 30 04:13:56 np0005601977 python3[156394]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:13:56 np0005601977 python3.9[156546]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:57 np0005601977 python3.9[156624]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:13:57.365 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:13:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:13:57.366 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:13:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:13:57.366 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:13:57 np0005601977 python3.9[156776]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:58 np0005601977 python3.9[156901]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764437.4412208-3606-182590679395601/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:58 np0005601977 podman[157025]: 2026-01-30 09:13:58.970898502 +0000 UTC m=+0.055219658 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:13:59 np0005601977 python3.9[157070]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:59 np0005601977 python3.9[157150]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:00 np0005601977 python3.9[157302]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:00 np0005601977 python3.9[157380]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:01 np0005601977 python3.9[157532]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:02 np0005601977 python3.9[157657]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764441.1651113-3723-26481431733745/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:02 np0005601977 podman[157781]: 2026-01-30 09:14:02.865367959 +0000 UTC m=+0.166801934 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:14:02 np0005601977 python3.9[157829]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:03 np0005601977 python3.9[157987]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:04 np0005601977 python3.9[158142]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:05 np0005601977 python3.9[158294]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:05 np0005601977 python3.9[158447]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:06 np0005601977 python3.9[158601]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:07 np0005601977 python3.9[158756]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:08 np0005601977 python3.9[158908]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:08 np0005601977 python3.9[159031]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764447.651231-3939-258400617675140/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:09 np0005601977 python3.9[159183]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:09 np0005601977 python3.9[159306]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764448.9032311-3984-260100401783355/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:10 np0005601977 python3.9[159458]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:11 np0005601977 python3.9[159581]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764450.1515377-4029-195360727490882/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:11 np0005601977 python3.9[159733]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:14:11 np0005601977 systemd[1]: Reloading.
Jan 30 04:14:11 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:11 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:12 np0005601977 systemd[1]: Reached target edpm_libvirt.target.
Jan 30 04:14:12 np0005601977 python3.9[159924]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 30 04:14:12 np0005601977 systemd[1]: Reloading.
Jan 30 04:14:13 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:13 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:13 np0005601977 systemd[1]: Reloading.
Jan 30 04:14:13 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:13 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:13 np0005601977 systemd[1]: session-23.scope: Deactivated successfully.
Jan 30 04:14:13 np0005601977 systemd[1]: session-23.scope: Consumed 2min 50.118s CPU time.
Jan 30 04:14:13 np0005601977 systemd-logind[809]: Session 23 logged out. Waiting for processes to exit.
Jan 30 04:14:13 np0005601977 systemd-logind[809]: Removed session 23.
Jan 30 04:14:20 np0005601977 systemd-logind[809]: New session 24 of user zuul.
Jan 30 04:14:20 np0005601977 systemd[1]: Started Session 24 of User zuul.
Jan 30 04:14:21 np0005601977 python3.9[160172]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:14:22 np0005601977 python3.9[160326]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:14:22 np0005601977 network[160343]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:14:22 np0005601977 network[160344]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:14:22 np0005601977 network[160345]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:14:28 np0005601977 python3.9[160616]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:14:29 np0005601977 python3.9[160700]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:14:29 np0005601977 podman[160702]: 2026-01-30 09:14:29.884443769 +0000 UTC m=+0.095496970 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:14:33 np0005601977 podman[160722]: 2026-01-30 09:14:33.833318488 +0000 UTC m=+0.056105019 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:14:34 np0005601977 python3.9[160899]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:35 np0005601977 python3.9[161051]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:36 np0005601977 python3.9[161204]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:37 np0005601977 python3.9[161356]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:37 np0005601977 python3.9[161509]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:38 np0005601977 python3.9[161632]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764477.5651796-240-65991160258397/.source.iscsi _original_basename=.w81eqexr follow=False checksum=e9c361f5ff799a7cb9450657312add397a9bfa06 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:39 np0005601977 python3.9[161784]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:40 np0005601977 python3.9[161936]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:41 np0005601977 python3.9[162088]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:14:41 np0005601977 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 30 04:14:42 np0005601977 python3.9[162244]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:14:42 np0005601977 systemd[1]: Reloading.
Jan 30 04:14:42 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:42 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:42 np0005601977 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 30 04:14:42 np0005601977 systemd[1]: Starting Open-iSCSI...
Jan 30 04:14:42 np0005601977 kernel: Loading iSCSI transport class v2.0-870.
Jan 30 04:14:42 np0005601977 systemd[1]: Started Open-iSCSI.
Jan 30 04:14:42 np0005601977 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 30 04:14:42 np0005601977 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 30 04:14:43 np0005601977 python3.9[162443]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:14:43 np0005601977 network[162460]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:14:43 np0005601977 network[162461]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:14:43 np0005601977 network[162462]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:14:47 np0005601977 python3.9[162733]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:14:50 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:14:50 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:14:50 np0005601977 systemd[1]: Reloading.
Jan 30 04:14:50 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:50 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:50 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:14:50 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:14:50 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:14:50 np0005601977 systemd[1]: run-r1edd5715fb97426ca9c523d651c4c233.service: Deactivated successfully.
Jan 30 04:14:52 np0005601977 python3.9[163051]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 30 04:14:53 np0005601977 python3.9[163203]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 30 04:14:54 np0005601977 python3.9[163359]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:54 np0005601977 python3.9[163482]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764493.6855085-504-174760966419777/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:55 np0005601977 python3.9[163634]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:56 np0005601977 python3.9[163786]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:14:56 np0005601977 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 30 04:14:56 np0005601977 systemd[1]: Stopped Load Kernel Modules.
Jan 30 04:14:56 np0005601977 systemd[1]: Stopping Load Kernel Modules...
Jan 30 04:14:56 np0005601977 systemd[1]: Starting Load Kernel Modules...
Jan 30 04:14:56 np0005601977 systemd[1]: Finished Load Kernel Modules.
Jan 30 04:14:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:14:57.366 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:14:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:14:57.367 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:14:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:14:57.368 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:14:57 np0005601977 python3.9[163942]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:58 np0005601977 python3.9[164095]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:59 np0005601977 python3.9[164247]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:59 np0005601977 python3.9[164370]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764498.8190186-657-219838090489555/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:00 np0005601977 podman[164494]: 2026-01-30 09:15:00.403905594 +0000 UTC m=+0.048502002 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 30 04:15:00 np0005601977 python3.9[164539]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:01 np0005601977 python3.9[164694]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:02 np0005601977 python3.9[164846]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:02 np0005601977 python3.9[164998]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:03 np0005601977 python3.9[165150]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:03 np0005601977 podman[165302]: 2026-01-30 09:15:03.944419316 +0000 UTC m=+0.084740534 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:15:04 np0005601977 python3.9[165303]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:04 np0005601977 python3.9[165480]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:05 np0005601977 python3.9[165632]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:05 np0005601977 python3.9[165784]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:15:06 np0005601977 python3.9[165938]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:07 np0005601977 python3.9[166091]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:07 np0005601977 systemd[1]: Listening on multipathd control socket.
Jan 30 04:15:08 np0005601977 python3.9[166247]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:08 np0005601977 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 30 04:15:08 np0005601977 udevadm[166252]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 30 04:15:08 np0005601977 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 30 04:15:08 np0005601977 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 30 04:15:08 np0005601977 multipathd[166256]: --------start up--------
Jan 30 04:15:08 np0005601977 multipathd[166256]: read /etc/multipath.conf
Jan 30 04:15:08 np0005601977 multipathd[166256]: path checkers start up
Jan 30 04:15:08 np0005601977 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 30 04:15:09 np0005601977 python3.9[166415]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 30 04:15:10 np0005601977 python3.9[166567]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 30 04:15:10 np0005601977 kernel: Key type psk registered
Jan 30 04:15:11 np0005601977 python3.9[166730]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:15:12 np0005601977 python3.9[166853]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764511.1555502-1047-111190211965533/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:12 np0005601977 python3.9[167005]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:13 np0005601977 python3.9[167157]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:15:13 np0005601977 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 30 04:15:13 np0005601977 systemd[1]: Stopped Load Kernel Modules.
Jan 30 04:15:13 np0005601977 systemd[1]: Stopping Load Kernel Modules...
Jan 30 04:15:13 np0005601977 systemd[1]: Starting Load Kernel Modules...
Jan 30 04:15:13 np0005601977 systemd[1]: Finished Load Kernel Modules.
Jan 30 04:15:14 np0005601977 python3.9[167313]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:15:16 np0005601977 systemd[1]: Reloading.
Jan 30 04:15:17 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:17 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:17 np0005601977 systemd[1]: Reloading.
Jan 30 04:15:17 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:17 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:17 np0005601977 systemd-logind[809]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 30 04:15:17 np0005601977 systemd-logind[809]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 30 04:15:17 np0005601977 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:15:17 np0005601977 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:15:17 np0005601977 systemd[1]: Reloading.
Jan 30 04:15:17 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:17 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:17 np0005601977 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:15:18 np0005601977 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:15:18 np0005601977 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:15:18 np0005601977 systemd[1]: man-db-cache-update.service: Consumed 1.118s CPU time.
Jan 30 04:15:18 np0005601977 systemd[1]: run-r448a25e3429a4728827fc01c1e229b87.service: Deactivated successfully.
Jan 30 04:15:20 np0005601977 python3.9[168777]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:15:20 np0005601977 systemd[1]: Stopping Open-iSCSI...
Jan 30 04:15:20 np0005601977 iscsid[162285]: iscsid shutting down.
Jan 30 04:15:20 np0005601977 systemd[1]: iscsid.service: Deactivated successfully.
Jan 30 04:15:20 np0005601977 systemd[1]: Stopped Open-iSCSI.
Jan 30 04:15:20 np0005601977 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 30 04:15:20 np0005601977 systemd[1]: Starting Open-iSCSI...
Jan 30 04:15:20 np0005601977 systemd[1]: Started Open-iSCSI.
Jan 30 04:15:21 np0005601977 python3.9[168933]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:15:21 np0005601977 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 30 04:15:21 np0005601977 multipathd[166256]: exit (signal)
Jan 30 04:15:21 np0005601977 multipathd[166256]: --------shut down-------
Jan 30 04:15:21 np0005601977 systemd[1]: multipathd.service: Deactivated successfully.
Jan 30 04:15:21 np0005601977 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 30 04:15:21 np0005601977 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 30 04:15:21 np0005601977 multipathd[168939]: --------start up--------
Jan 30 04:15:21 np0005601977 multipathd[168939]: read /etc/multipath.conf
Jan 30 04:15:21 np0005601977 multipathd[168939]: path checkers start up
Jan 30 04:15:21 np0005601977 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 30 04:15:22 np0005601977 python3.9[169096]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:15:23 np0005601977 python3.9[169252]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:24 np0005601977 python3.9[169404]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:15:24 np0005601977 systemd[1]: Reloading.
Jan 30 04:15:24 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:24 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:25 np0005601977 python3.9[169590]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:15:25 np0005601977 network[169607]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:15:25 np0005601977 network[169608]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:15:25 np0005601977 network[169609]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:15:29 np0005601977 python3.9[169881]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:30 np0005601977 python3.9[170034]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:30 np0005601977 podman[170135]: 2026-01-30 09:15:30.847936931 +0000 UTC m=+0.057853755 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 30 04:15:31 np0005601977 python3.9[170206]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:31 np0005601977 python3.9[170359]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:32 np0005601977 python3.9[170512]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:33 np0005601977 python3.9[170665]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:34 np0005601977 python3.9[170818]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:34 np0005601977 podman[170820]: 2026-01-30 09:15:34.223925157 +0000 UTC m=+0.094472932 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:15:34 np0005601977 python3.9[170999]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:37 np0005601977 python3.9[171152]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:38 np0005601977 python3.9[171304]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:38 np0005601977 python3.9[171456]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:39 np0005601977 python3.9[171608]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:39 np0005601977 python3.9[171760]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:40 np0005601977 python3.9[171912]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:40 np0005601977 python3.9[172064]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:41 np0005601977 python3.9[172216]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:42 np0005601977 python3.9[172368]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:43 np0005601977 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 30 04:15:43 np0005601977 python3.9[172520]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:44 np0005601977 python3.9[172673]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:44 np0005601977 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 30 04:15:44 np0005601977 python3.9[172826]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:45 np0005601977 python3.9[172978]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:45 np0005601977 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 30 04:15:45 np0005601977 python3.9[173131]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:46 np0005601977 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 30 04:15:46 np0005601977 python3.9[173283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:47 np0005601977 python3.9[173436]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:48 np0005601977 python3.9[173588]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:49 np0005601977 python3.9[173740]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:15:50 np0005601977 python3.9[173892]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:15:50 np0005601977 systemd[1]: Reloading.
Jan 30 04:15:50 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:50 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:52 np0005601977 python3.9[174079]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:52 np0005601977 python3.9[174232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:53 np0005601977 python3.9[174385]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:53 np0005601977 python3.9[174538]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:54 np0005601977 python3.9[174691]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:54 np0005601977 python3.9[174844]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:55 np0005601977 python3.9[174997]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:55 np0005601977 python3.9[175150]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:15:57.367 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:15:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:15:57.368 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:15:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:15:57.368 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:15:57 np0005601977 python3.9[175303]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:15:58 np0005601977 python3.9[175455]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:15:58 np0005601977 python3.9[175607]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:15:59 np0005601977 python3.9[175759]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:00 np0005601977 python3.9[175911]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:00 np0005601977 python3.9[176063]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:01 np0005601977 podman[176187]: 2026-01-30 09:16:01.2811265 +0000 UTC m=+0.087716909 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:16:01 np0005601977 python3.9[176225]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:01 np0005601977 python3.9[176388]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:02 np0005601977 python3.9[176540]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:02 np0005601977 python3.9[176692]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:04 np0005601977 podman[176717]: 2026-01-30 09:16:04.867930072 +0000 UTC m=+0.062953131 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 30 04:16:08 np0005601977 python3.9[176870]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 30 04:16:10 np0005601977 python3.9[177023]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:16:12 np0005601977 python3.9[177181]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:16:20 np0005601977 systemd-logind[809]: New session 25 of user zuul.
Jan 30 04:16:20 np0005601977 systemd[1]: Started Session 25 of User zuul.
Jan 30 04:16:20 np0005601977 systemd[1]: session-25.scope: Deactivated successfully.
Jan 30 04:16:20 np0005601977 systemd-logind[809]: Session 25 logged out. Waiting for processes to exit.
Jan 30 04:16:20 np0005601977 systemd-logind[809]: Removed session 25.
Jan 30 04:16:20 np0005601977 python3.9[177367]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:21 np0005601977 python3.9[177488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764580.4528809-2634-253543546992970/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:22 np0005601977 python3.9[177638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:22 np0005601977 python3.9[177714]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:23 np0005601977 python3.9[177864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:23 np0005601977 python3.9[177985]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764582.7480958-2634-2462710127340/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:24 np0005601977 python3.9[178135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:24 np0005601977 python3.9[178256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764583.7707257-2634-96522210869672/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:25 np0005601977 python3.9[178406]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:25 np0005601977 python3.9[178527]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764584.732422-2634-140821897336011/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:26 np0005601977 python3.9[178677]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:26 np0005601977 python3.9[178798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764585.7785149-2634-267220509337767/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:27 np0005601977 python3.9[178950]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:28 np0005601977 python3.9[179102]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:28 np0005601977 python3.9[179254]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:29 np0005601977 python3.9[179406]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:30 np0005601977 python3.9[179529]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769764589.2227871-2955-56321329781005/.source _original_basename=.wds2ga4e follow=False checksum=1a722b4cc7d4183d1e68dd5e572a9ce1698b87a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 30 04:16:31 np0005601977 python3.9[179681]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:31 np0005601977 podman[179807]: 2026-01-30 09:16:31.648210343 +0000 UTC m=+0.063366433 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:16:31 np0005601977 python3.9[179844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:32 np0005601977 python3.9[179973]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764591.3514209-3033-51426888423271/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:33 np0005601977 python3.9[180123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:33 np0005601977 python3.9[180244]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764592.6584783-3078-127640064547314/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:34 np0005601977 python3.9[180396]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 30 04:16:35 np0005601977 podman[180520]: 2026-01-30 09:16:35.54760607 +0000 UTC m=+0.157125722 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:16:35 np0005601977 python3.9[180563]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:16:36 np0005601977 python3[180729]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:16:36 np0005601977 podman[180767]: 2026-01-30 09:16:36.844961016 +0000 UTC m=+0.056625510 container create e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251202, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:16:36 np0005601977 podman[180767]: 2026-01-30 09:16:36.814489995 +0000 UTC m=+0.026154549 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 30 04:16:36 np0005601977 python3[180729]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 30 04:16:37 np0005601977 python3.9[180957]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:38 np0005601977 python3.9[181111]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 30 04:16:39 np0005601977 python3.9[181263]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:16:40 np0005601977 python3[181415]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:16:40 np0005601977 podman[181451]: 2026-01-30 09:16:40.69349965 +0000 UTC m=+0.046199291 container create 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 30 04:16:40 np0005601977 podman[181451]: 2026-01-30 09:16:40.670719199 +0000 UTC m=+0.023418860 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 30 04:16:40 np0005601977 python3[181415]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 30 04:16:41 np0005601977 python3.9[181642]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:42 np0005601977 python3.9[181796]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:42 np0005601977 python3.9[181947]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764602.48074-3366-260961458337937/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:43 np0005601977 python3.9[182023]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:16:43 np0005601977 systemd[1]: Reloading.
Jan 30 04:16:43 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:16:43 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:16:44 np0005601977 python3.9[182136]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:16:44 np0005601977 systemd[1]: Reloading.
Jan 30 04:16:44 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:16:44 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:16:44 np0005601977 systemd[1]: Starting nova_compute container...
Jan 30 04:16:44 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:16:44 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:44 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:44 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:44 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:44 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:44 np0005601977 podman[182175]: 2026-01-30 09:16:44.676599241 +0000 UTC m=+0.101933865 container init 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute)
Jan 30 04:16:44 np0005601977 podman[182175]: 2026-01-30 09:16:44.683459157 +0000 UTC m=+0.108793731 container start 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3)
Jan 30 04:16:44 np0005601977 podman[182175]: nova_compute
Jan 30 04:16:44 np0005601977 systemd[1]: Started nova_compute container.
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + sudo -E kolla_set_configs
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Validating config file
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying service configuration files
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Deleting /etc/ceph
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Creating directory /etc/ceph
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /etc/ceph
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Writing out command to execute
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:44 np0005601977 nova_compute[182190]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:44 np0005601977 nova_compute[182190]: ++ cat /run_command
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + CMD=nova-compute
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + ARGS=
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + sudo kolla_copy_cacerts
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + [[ ! -n '' ]]
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + . kolla_extend_start
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + echo 'Running command: '\''nova-compute'\'''
Jan 30 04:16:44 np0005601977 nova_compute[182190]: Running command: 'nova-compute'
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + umask 0022
Jan 30 04:16:44 np0005601977 nova_compute[182190]: + exec nova-compute
Jan 30 04:16:46 np0005601977 python3.9[182352]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.519 182194 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.519 182194 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.519 182194 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.519 182194 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.651 182194 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.660 182194 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:16:46 np0005601977 nova_compute[182190]: 2026-01-30 09:16:46.660 182194 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 30 04:16:47 np0005601977 python3.9[182506]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.295 182194 INFO nova.virt.driver [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.380 182194 INFO nova.compute.provider_config [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.399 182194 DEBUG oslo_concurrency.lockutils [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.400 182194 DEBUG oslo_concurrency.lockutils [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.400 182194 DEBUG oslo_concurrency.lockutils [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.400 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.401 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.401 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.401 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.401 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.401 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.401 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.402 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.403 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.403 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.403 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.403 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.403 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.403 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.404 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.405 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.405 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.405 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.405 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.405 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.405 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.406 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.406 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.406 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.406 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.406 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.406 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.407 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.407 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.407 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.407 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.407 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.407 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.408 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.408 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.408 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.408 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.408 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.408 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.409 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.410 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.411 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.412 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.412 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.412 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.412 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.412 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.412 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.413 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.414 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.414 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.414 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.414 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.414 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.414 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.415 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.416 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.416 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.416 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.416 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.416 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.416 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.417 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.418 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.419 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.420 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.421 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.422 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.423 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.423 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.423 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.423 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.423 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.423 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.424 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.425 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.426 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.427 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.428 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.428 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.428 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.428 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.428 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.428 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.429 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.430 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.431 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.431 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.431 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.431 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.431 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.431 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.432 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.433 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.434 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.435 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.436 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.436 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.436 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.436 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.436 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.436 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.437 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.438 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.439 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.440 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.441 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.442 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.443 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.443 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.443 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.443 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.444 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.445 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.445 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.445 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.445 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.445 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.446 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.447 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.448 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.449 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.450 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.450 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.450 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.450 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.450 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.450 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.451 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.451 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.451 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.451 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.451 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.451 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.452 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.452 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.452 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.452 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.452 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.452 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.453 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.453 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.453 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.453 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.453 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.454 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.454 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.454 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.454 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.454 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.454 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.455 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.455 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.455 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.455 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.455 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.455 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.456 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.457 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.457 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.457 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.457 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.457 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.457 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.458 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.458 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.458 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.458 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.458 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.458 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.459 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.460 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.461 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.461 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.461 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.461 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.461 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.461 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.462 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.462 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.462 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.462 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.462 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.462 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.463 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.463 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.463 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.463 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.463 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.463 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.464 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.465 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.465 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.465 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.465 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.465 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.465 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.466 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.466 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.466 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.466 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.466 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.466 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.467 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.467 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.467 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.467 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.467 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.467 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.468 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.468 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.468 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.468 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.468 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.468 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.469 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.469 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.469 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.469 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.469 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.469 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.470 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.470 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.470 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.470 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.470 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.470 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.471 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.472 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.472 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.472 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.472 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.472 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.472 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.473 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.474 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.474 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.474 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.474 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.474 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.475 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.475 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.475 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.475 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.475 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.476 182194 WARNING oslo_config.cfg [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 30 04:16:47 np0005601977 nova_compute[182190]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 30 04:16:47 np0005601977 nova_compute[182190]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 30 04:16:47 np0005601977 nova_compute[182190]: and ``live_migration_inbound_addr`` respectively.
Jan 30 04:16:47 np0005601977 nova_compute[182190]: ).  Its value may be silently ignored in the future.#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.476 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.476 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.476 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.476 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.477 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.478 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.479 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.480 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.480 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.480 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.480 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.480 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.480 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.481 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.481 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.481 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.481 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.481 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.481 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.482 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.482 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.482 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.482 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.482 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.482 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.483 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.483 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.483 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.483 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.483 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.483 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.484 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.484 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.484 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.484 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.484 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.484 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.485 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.485 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.485 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.485 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.485 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.485 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.486 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.486 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.486 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.486 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.486 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.486 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.487 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.488 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.488 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.488 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.488 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.488 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.488 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.489 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.490 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.490 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.490 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.490 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.490 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.491 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.491 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.491 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.491 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.491 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.491 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.492 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.493 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.493 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.493 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.493 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.493 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.493 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.494 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.495 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.496 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.497 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.498 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.498 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.498 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.498 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.498 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.499 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.499 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.499 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.499 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.499 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.499 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.500 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.500 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.500 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.500 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.500 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.500 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.501 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.501 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.501 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.501 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.501 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.502 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.502 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.502 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.502 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.502 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.502 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.503 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.503 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.503 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.503 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.503 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.503 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.504 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.504 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.504 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.504 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.504 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.504 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.505 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.505 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.505 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.505 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.506 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.507 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.508 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.508 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.508 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.508 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.508 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.509 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.510 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.510 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.510 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.510 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.510 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.510 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.511 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.512 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.513 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.514 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.514 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.514 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.514 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.514 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.515 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.515 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.515 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.515 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.515 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.516 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.516 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.516 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.516 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.516 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.517 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.518 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.519 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.520 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.521 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.521 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.521 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.521 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.521 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.521 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.522 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.522 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.522 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.522 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.522 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.522 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.523 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.523 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.523 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.523 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.523 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.523 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.524 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.524 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.524 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.524 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.524 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.524 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.525 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.525 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.525 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.525 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.525 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.525 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.526 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.527 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.527 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.527 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.527 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.527 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.527 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.528 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.529 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.529 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.529 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.529 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.529 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.529 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.530 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.530 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.530 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.530 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.530 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.530 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.531 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.531 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.531 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.531 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.531 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.531 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.532 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.532 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.532 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.532 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.532 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.532 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.533 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.534 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.535 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.536 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.537 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.537 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.537 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.537 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.537 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.537 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.538 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.538 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.538 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.538 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.538 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.538 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.539 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.540 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.540 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.540 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.540 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.540 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.540 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.541 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.541 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.541 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.541 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.541 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.541 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.542 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.543 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.543 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.543 182194 DEBUG oslo_service.service [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.545 182194 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.562 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.563 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.563 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.563 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 30 04:16:47 np0005601977 systemd[1]: Starting libvirt QEMU daemon...
Jan 30 04:16:47 np0005601977 systemd[1]: Started libvirt QEMU daemon.
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.623 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f878c64deb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.626 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f878c64deb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.627 182194 INFO nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.653 182194 WARNING nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Jan 30 04:16:47 np0005601977 nova_compute[182190]: 2026-01-30 09:16:47.653 182194 DEBUG nova.virt.libvirt.volume.mount [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 30 04:16:47 np0005601977 python3.9[182708]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.458 182194 INFO nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Libvirt host capabilities <capabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <host>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <uuid>84994b48-1455-435f-a6fe-1797df140bfa</uuid>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <arch>x86_64</arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model>EPYC-Rome-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <vendor>AMD</vendor>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <microcode version='16777317'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <signature family='23' model='49' stepping='0'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='x2apic'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='tsc-deadline'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='osxsave'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='hypervisor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='tsc_adjust'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='spec-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='stibp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='arch-capabilities'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='cmp_legacy'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='topoext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='virt-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='lbrv'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='tsc-scale'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='vmcb-clean'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='pause-filter'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='pfthreshold'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='svme-addr-chk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='rdctl-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='skip-l1dfl-vmentry'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='mds-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature name='pschange-mc-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <pages unit='KiB' size='4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <pages unit='KiB' size='2048'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <pages unit='KiB' size='1048576'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <power_management>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <suspend_mem/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <suspend_disk/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <suspend_hybrid/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </power_management>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <iommu support='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <migration_features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <live/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <uri_transports>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <uri_transport>tcp</uri_transport>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <uri_transport>rdma</uri_transport>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </uri_transports>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </migration_features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <topology>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <cells num='1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <cell id='0'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          <memory unit='KiB'>7864292</memory>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          <pages unit='KiB' size='4'>1966073</pages>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          <pages unit='KiB' size='2048'>0</pages>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          <distances>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <sibling id='0' value='10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          </distances>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          <cpus num='8'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:          </cpus>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        </cell>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </cells>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </topology>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <cache>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </cache>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <secmodel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model>selinux</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <doi>0</doi>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </secmodel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <secmodel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model>dac</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <doi>0</doi>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </secmodel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </host>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <guest>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <os_type>hvm</os_type>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <arch name='i686'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <wordsize>32</wordsize>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <domain type='qemu'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <domain type='kvm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <pae/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <nonpae/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <acpi default='on' toggle='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <apic default='on' toggle='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <cpuselection/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <deviceboot/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <disksnapshot default='on' toggle='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <externalSnapshot/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </guest>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <guest>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <os_type>hvm</os_type>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <arch name='x86_64'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <wordsize>64</wordsize>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <domain type='qemu'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <domain type='kvm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <acpi default='on' toggle='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <apic default='on' toggle='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <cpuselection/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <deviceboot/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <disksnapshot default='on' toggle='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <externalSnapshot/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </guest>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 
Jan 30 04:16:48 np0005601977 nova_compute[182190]: </capabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: #033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.465 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.483 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 30 04:16:48 np0005601977 nova_compute[182190]: <domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <domain>kvm</domain>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <arch>i686</arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <vcpu max='240'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <iothreads supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <os supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='firmware'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <loader supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>rom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pflash</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='readonly'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>yes</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='secure'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </loader>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </os>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='maximumMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <vendor>AMD</vendor>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='succor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='custom' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <memoryBacking supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='sourceType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>anonymous</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>memfd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </memoryBacking>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <disk supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='diskDevice'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>disk</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cdrom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>floppy</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>lun</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ide</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>fdc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>sata</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </disk>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <graphics supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vnc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egl-headless</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </graphics>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <video supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='modelType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vga</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cirrus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>none</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>bochs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ramfb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </video>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hostdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='mode'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>subsystem</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='startupPolicy'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>mandatory</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>requisite</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>optional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='subsysType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pci</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='capsType'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='pciBackend'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hostdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <rng supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>random</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </rng>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <filesystem supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='driverType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>path</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>handle</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtiofs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </filesystem>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tpm supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-tis</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-crb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emulator</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>external</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendVersion'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>2.0</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </tpm>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <redirdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </redirdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <channel supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </channel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <crypto supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </crypto>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <interface supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>passt</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </interface>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <panic supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>isa</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>hyperv</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </panic>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <console supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>null</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dev</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pipe</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stdio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>udp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tcp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu-vdagent</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </console>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <gic supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <genid supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backup supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <async-teardown supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <s390-pv supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <ps2 supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tdx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sev supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sgx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hyperv supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='features'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>relaxed</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vapic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>spinlocks</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vpindex</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>runtime</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>synic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stimer</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reset</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vendor_id</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>frequencies</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reenlightenment</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tlbflush</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ipi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>avic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emsr_bitmap</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>xmm_input</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hyperv>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <launchSecurity supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: </domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.491 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 30 04:16:48 np0005601977 nova_compute[182190]: <domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <domain>kvm</domain>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <arch>i686</arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <vcpu max='4096'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <iothreads supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <os supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='firmware'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <loader supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>rom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pflash</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='readonly'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>yes</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='secure'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </loader>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </os>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='maximumMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <vendor>AMD</vendor>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='succor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='custom' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <memoryBacking supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='sourceType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>anonymous</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>memfd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </memoryBacking>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <disk supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='diskDevice'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>disk</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cdrom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>floppy</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>lun</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>fdc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>sata</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </disk>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <graphics supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vnc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egl-headless</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </graphics>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <video supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='modelType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vga</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cirrus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>none</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>bochs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ramfb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </video>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hostdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='mode'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>subsystem</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='startupPolicy'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>mandatory</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>requisite</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>optional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='subsysType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pci</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='capsType'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='pciBackend'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hostdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <rng supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>random</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </rng>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <filesystem supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='driverType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>path</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>handle</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtiofs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </filesystem>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tpm supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-tis</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-crb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emulator</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>external</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendVersion'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>2.0</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </tpm>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <redirdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </redirdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <channel supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </channel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <crypto supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </crypto>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <interface supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>passt</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </interface>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <panic supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>isa</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>hyperv</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </panic>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <console supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>null</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dev</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pipe</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stdio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>udp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tcp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu-vdagent</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </console>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <gic supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <genid supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backup supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <async-teardown supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <s390-pv supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <ps2 supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tdx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sev supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sgx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hyperv supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='features'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>relaxed</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vapic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>spinlocks</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vpindex</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>runtime</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>synic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stimer</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reset</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vendor_id</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>frequencies</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reenlightenment</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tlbflush</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ipi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>avic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emsr_bitmap</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>xmm_input</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hyperv>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <launchSecurity supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: </domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.567 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.571 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 30 04:16:48 np0005601977 nova_compute[182190]: <domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <domain>kvm</domain>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <arch>x86_64</arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <vcpu max='240'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <iothreads supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <os supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='firmware'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <loader supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>rom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pflash</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='readonly'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>yes</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='secure'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </loader>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </os>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='maximumMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <vendor>AMD</vendor>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='succor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='custom' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <memoryBacking supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='sourceType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>anonymous</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>memfd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </memoryBacking>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <disk supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='diskDevice'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>disk</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cdrom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>floppy</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>lun</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ide</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>fdc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>sata</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </disk>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <graphics supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vnc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egl-headless</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </graphics>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <video supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='modelType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vga</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cirrus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>none</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>bochs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ramfb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </video>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hostdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='mode'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>subsystem</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='startupPolicy'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>mandatory</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>requisite</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>optional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='subsysType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pci</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='capsType'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='pciBackend'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hostdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <rng supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>random</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </rng>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <filesystem supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='driverType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>path</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>handle</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtiofs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </filesystem>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tpm supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-tis</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-crb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emulator</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>external</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendVersion'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>2.0</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </tpm>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <redirdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </redirdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <channel supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </channel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <crypto supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </crypto>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <interface supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>passt</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </interface>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <panic supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>isa</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>hyperv</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </panic>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <console supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>null</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dev</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pipe</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stdio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>udp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tcp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu-vdagent</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </console>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <gic supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <genid supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backup supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <async-teardown supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <s390-pv supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <ps2 supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tdx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sev supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sgx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hyperv supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='features'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>relaxed</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vapic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>spinlocks</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vpindex</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>runtime</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>synic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stimer</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reset</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vendor_id</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>frequencies</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reenlightenment</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tlbflush</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ipi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>avic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emsr_bitmap</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>xmm_input</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hyperv>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <launchSecurity supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: </domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.635 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 30 04:16:48 np0005601977 nova_compute[182190]: <domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <domain>kvm</domain>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <arch>x86_64</arch>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <vcpu max='4096'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <iothreads supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <os supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='firmware'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>efi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <loader supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>rom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pflash</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='readonly'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>yes</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='secure'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>yes</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>no</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </loader>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </os>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='maximumMigratable'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>on</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>off</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <vendor>AMD</vendor>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='succor'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <mode name='custom' supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ddpd-u'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sha512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm3'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sm4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Denverton-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amd-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='auto-ibrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='perfmon-v2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbpb'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='stibp-always-on'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='EPYC-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-128'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-256'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx10-512'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='prefetchiti'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Haswell-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512er'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512pf'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fma4'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tbm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xop'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='amx-tile'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-bf16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-fp16'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bitalg'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrc'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fzrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='la57'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='taa-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ifma'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cmpccxadd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fbsdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='fsrs'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ibrs-all'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='intel-psfd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='lam'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mcdt-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pbrsb-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='psdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='serialize'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vaes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='hle'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='rtm'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512bw'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512cd'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512dq'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512f'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='avx512vl'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='invpcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pcid'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='pku'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='mpx'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='core-capability'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='split-lock-detect'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='cldemote'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='erms'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='gfni'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdir64b'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='movdiri'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='xsaves'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='athlon-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='core2duo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='coreduo-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='n270-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='ss'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <blockers model='phenom-v1'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnow'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <feature name='3dnowext'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </blockers>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </mode>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <memoryBacking supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <enum name='sourceType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>anonymous</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <value>memfd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </memoryBacking>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <disk supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='diskDevice'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>disk</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cdrom</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>floppy</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>lun</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>fdc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>sata</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </disk>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <graphics supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vnc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egl-headless</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </graphics>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <video supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='modelType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vga</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>cirrus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>none</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>bochs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ramfb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </video>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hostdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='mode'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>subsystem</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='startupPolicy'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>mandatory</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>requisite</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>optional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='subsysType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pci</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>scsi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='capsType'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='pciBackend'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hostdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <rng supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtio-non-transitional</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>random</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>egd</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </rng>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <filesystem supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='driverType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>path</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>handle</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>virtiofs</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </filesystem>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tpm supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-tis</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tpm-crb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emulator</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>external</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendVersion'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>2.0</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </tpm>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <redirdev supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='bus'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>usb</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </redirdev>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <channel supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </channel>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <crypto supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendModel'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>builtin</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </crypto>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <interface supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='backendType'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>default</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>passt</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </interface>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <panic supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='model'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>isa</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>hyperv</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </panic>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <console supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='type'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>null</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vc</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pty</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dev</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>file</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>pipe</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stdio</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>udp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tcp</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>unix</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>qemu-vdagent</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>dbus</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </console>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </devices>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <gic supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <genid supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <backup supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <async-teardown supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <s390-pv supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <ps2 supported='yes'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <tdx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sev supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <sgx supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <hyperv supported='yes'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <enum name='features'>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>relaxed</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vapic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>spinlocks</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vpindex</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>runtime</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>synic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>stimer</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reset</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>vendor_id</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>frequencies</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>reenlightenment</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>tlbflush</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>ipi</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>avic</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>emsr_bitmap</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <value>xmm_input</value>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </enum>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      <defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:      </defaults>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    </hyperv>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:    <launchSecurity supported='no'/>
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  </features>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: </domainCapabilities>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.697 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.698 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.698 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.702 182194 INFO nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Secure Boot support detected#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.704 182194 INFO nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.705 182194 INFO nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.712 182194 DEBUG nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 30 04:16:48 np0005601977 nova_compute[182190]:  <model>Nehalem</model>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: </cpu>
Jan 30 04:16:48 np0005601977 nova_compute[182190]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.714 182194 DEBUG nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.751 182194 INFO nova.virt.node [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Determined node identity eb11f67d-14b4-46ee-89fd-92936c45ed58 from /var/lib/nova/compute_id#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.783 182194 WARNING nova.compute.manager [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Compute nodes ['eb11f67d-14b4-46ee-89fd-92936c45ed58'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.823 182194 INFO nova.compute.manager [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.859 182194 WARNING nova.compute.manager [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.859 182194 DEBUG oslo_concurrency.lockutils [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.860 182194 DEBUG oslo_concurrency.lockutils [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.860 182194 DEBUG oslo_concurrency.lockutils [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:16:48 np0005601977 nova_compute[182190]: 2026-01-30 09:16:48.860 182194 DEBUG nova.compute.resource_tracker [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:16:48 np0005601977 systemd[1]: Starting libvirt nodedev daemon...
Jan 30 04:16:48 np0005601977 systemd[1]: Started libvirt nodedev daemon.
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.108 182194 WARNING nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.109 182194 DEBUG nova.compute.resource_tracker [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6179MB free_disk=73.58016967773438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.109 182194 DEBUG oslo_concurrency.lockutils [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.110 182194 DEBUG oslo_concurrency.lockutils [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.138 182194 WARNING nova.compute.resource_tracker [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] No compute node record for compute-0.ctlplane.example.com:eb11f67d-14b4-46ee-89fd-92936c45ed58: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host eb11f67d-14b4-46ee-89fd-92936c45ed58 could not be found.#033[00m
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.168 182194 INFO nova.compute.resource_tracker [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: eb11f67d-14b4-46ee-89fd-92936c45ed58#033[00m
Jan 30 04:16:49 np0005601977 python3.9[182873]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.246 182194 DEBUG nova.compute.resource_tracker [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:16:49 np0005601977 nova_compute[182190]: 2026-01-30 09:16:49.247 182194 DEBUG nova.compute.resource_tracker [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:16:49 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:16:50 np0005601977 python3.9[183069]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.141 182194 INFO nova.scheduler.client.report [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] [req-9245bc38-cc95-43f6-9ee3-c6b0a133b401] Created resource provider record via placement API for resource provider with UUID eb11f67d-14b4-46ee-89fd-92936c45ed58 and name compute-0.ctlplane.example.com.#033[00m
Jan 30 04:16:50 np0005601977 systemd[1]: Stopping nova_compute container...
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.171 182194 DEBUG nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 30 04:16:50 np0005601977 nova_compute[182190]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.171 182194 INFO nova.virt.libvirt.host [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.172 182194 DEBUG nova.compute.provider_tree [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.172 182194 DEBUG nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.174 182194 DEBUG nova.virt.libvirt.driver [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Libvirt baseline CPU <cpu>
Jan 30 04:16:50 np0005601977 nova_compute[182190]:  <arch>x86_64</arch>
Jan 30 04:16:50 np0005601977 nova_compute[182190]:  <model>Nehalem</model>
Jan 30 04:16:50 np0005601977 nova_compute[182190]:  <vendor>AMD</vendor>
Jan 30 04:16:50 np0005601977 nova_compute[182190]:  <topology sockets="8" cores="1" threads="1"/>
Jan 30 04:16:50 np0005601977 nova_compute[182190]: </cpu>
Jan 30 04:16:50 np0005601977 nova_compute[182190]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.208 182194 DEBUG oslo_concurrency.lockutils [None req-87e8d04b-60f3-42e7-a9d2-01bd2a3082a6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.209 182194 DEBUG oslo_concurrency.lockutils [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.209 182194 DEBUG oslo_concurrency.lockutils [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:16:50 np0005601977 nova_compute[182190]: 2026-01-30 09:16:50.209 182194 DEBUG oslo_concurrency.lockutils [None req-b2c79e12-d9ea-498e-b6dd-083835d9a377 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:16:50 np0005601977 virtqemud[182587]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 30 04:16:50 np0005601977 virtqemud[182587]: hostname: compute-0
Jan 30 04:16:50 np0005601977 virtqemud[182587]: End of file while reading data: Input/output error
Jan 30 04:16:50 np0005601977 systemd[1]: libpod-29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982.scope: Deactivated successfully.
Jan 30 04:16:50 np0005601977 systemd[1]: libpod-29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982.scope: Consumed 3.026s CPU time.
Jan 30 04:16:50 np0005601977 podman[183073]: 2026-01-30 09:16:50.741061858 +0000 UTC m=+0.575478242 container died 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 30 04:16:50 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982-userdata-shm.mount: Deactivated successfully.
Jan 30 04:16:50 np0005601977 systemd[1]: var-lib-containers-storage-overlay-ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd-merged.mount: Deactivated successfully.
Jan 30 04:16:50 np0005601977 podman[183073]: 2026-01-30 09:16:50.806652293 +0000 UTC m=+0.641068687 container cleanup 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:16:50 np0005601977 podman[183073]: nova_compute
Jan 30 04:16:50 np0005601977 podman[183101]: nova_compute
Jan 30 04:16:50 np0005601977 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 30 04:16:50 np0005601977 systemd[1]: Stopped nova_compute container.
Jan 30 04:16:50 np0005601977 systemd[1]: Starting nova_compute container...
Jan 30 04:16:51 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:16:51 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:51 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:51 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:51 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:51 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebe54b2eb6e7f0933b409e7355f839c7735ab04da5df7f31704147b00187a3fd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:51 np0005601977 podman[183115]: 2026-01-30 09:16:51.320662346 +0000 UTC m=+0.427955764 container init 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:16:51 np0005601977 podman[183115]: 2026-01-30 09:16:51.326622687 +0000 UTC m=+0.433916075 container start 29e0929d0e45e12f17c947f3edef2ccdb48bf9bc5f7d742e4b9e0c81f459a982 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute)
Jan 30 04:16:51 np0005601977 podman[183115]: nova_compute
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + sudo -E kolla_set_configs
Jan 30 04:16:51 np0005601977 systemd[1]: Started nova_compute container.
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Validating config file
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying service configuration files
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /etc/ceph
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Creating directory /etc/ceph
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /etc/ceph
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Writing out command to execute
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:51 np0005601977 nova_compute[183130]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:51 np0005601977 nova_compute[183130]: ++ cat /run_command
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + CMD=nova-compute
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + ARGS=
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + sudo kolla_copy_cacerts
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + [[ ! -n '' ]]
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + . kolla_extend_start
Jan 30 04:16:51 np0005601977 nova_compute[183130]: Running command: 'nova-compute'
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + echo 'Running command: '\''nova-compute'\'''
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + umask 0022
Jan 30 04:16:51 np0005601977 nova_compute[183130]: + exec nova-compute
Jan 30 04:16:52 np0005601977 python3.9[183293]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 30 04:16:52 np0005601977 systemd[1]: Started libpod-conmon-e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42.scope.
Jan 30 04:16:52 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:16:52 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68093f085271828abfb9ef2a51196fd3db1cf290eaaff8aaa83fb6c7553cd9e3/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:52 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68093f085271828abfb9ef2a51196fd3db1cf290eaaff8aaa83fb6c7553cd9e3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:52 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68093f085271828abfb9ef2a51196fd3db1cf290eaaff8aaa83fb6c7553cd9e3/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:52 np0005601977 podman[183315]: 2026-01-30 09:16:52.251222337 +0000 UTC m=+0.104035555 container init e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:16:52 np0005601977 podman[183315]: 2026-01-30 09:16:52.257136986 +0000 UTC m=+0.109950174 container start e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.build-date=20251202)
Jan 30 04:16:52 np0005601977 python3.9[183293]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Applying nova statedir ownership
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 30 04:16:52 np0005601977 nova_compute_init[183337]: INFO:nova_statedir:Nova statedir ownership complete
Jan 30 04:16:52 np0005601977 systemd[1]: libpod-e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42.scope: Deactivated successfully.
Jan 30 04:16:52 np0005601977 podman[183351]: 2026-01-30 09:16:52.355846348 +0000 UTC m=+0.043458313 container died e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Jan 30 04:16:52 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42-userdata-shm.mount: Deactivated successfully.
Jan 30 04:16:52 np0005601977 systemd[1]: var-lib-containers-storage-overlay-68093f085271828abfb9ef2a51196fd3db1cf290eaaff8aaa83fb6c7553cd9e3-merged.mount: Deactivated successfully.
Jan 30 04:16:52 np0005601977 podman[183351]: 2026-01-30 09:16:52.374213733 +0000 UTC m=+0.061825688 container cleanup e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:16:52 np0005601977 systemd[1]: libpod-conmon-e2a50a3815a15871dd7b4780442d10bfe79005bed021444ab57422f2f4404b42.scope: Deactivated successfully.
Jan 30 04:16:52 np0005601977 systemd[1]: session-24.scope: Deactivated successfully.
Jan 30 04:16:52 np0005601977 systemd[1]: session-24.scope: Consumed 1min 21.542s CPU time.
Jan 30 04:16:52 np0005601977 systemd-logind[809]: Session 24 logged out. Waiting for processes to exit.
Jan 30 04:16:52 np0005601977 systemd-logind[809]: Removed session 24.
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.189 183134 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.189 183134 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.190 183134 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.190 183134 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.326 183134 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.334 183134 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.334 183134 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.749 183134 INFO nova.virt.driver [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.840 183134 INFO nova.compute.provider_config [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.885 183134 DEBUG oslo_concurrency.lockutils [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.886 183134 DEBUG oslo_concurrency.lockutils [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.886 183134 DEBUG oslo_concurrency.lockutils [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.887 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.887 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.887 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.887 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.887 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.888 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.888 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.888 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.888 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.889 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.889 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.889 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.889 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.889 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.890 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.890 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.890 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.890 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.891 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.891 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.891 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.891 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.891 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.892 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.892 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.892 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.892 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.893 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.893 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.893 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.893 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.893 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.894 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.894 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.894 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.894 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.895 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.895 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.895 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.895 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.895 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.896 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.896 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.896 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.897 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.897 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.897 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.897 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.898 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.898 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.898 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.898 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.898 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.899 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.899 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.899 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.899 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.899 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.900 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.900 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.900 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.900 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.901 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.901 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.901 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.901 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.901 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.902 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.902 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.902 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.902 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.902 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.903 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.903 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.903 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.903 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.904 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.904 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.904 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.904 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.904 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.905 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.905 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.905 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.905 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.905 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.906 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.906 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.906 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.906 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.907 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.907 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.907 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.907 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.907 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.908 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.908 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.908 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.908 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.908 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.909 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.909 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.909 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.909 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.910 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.910 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.910 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.910 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.910 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.911 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.911 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.911 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.911 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.912 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.912 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.912 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.912 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.913 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.913 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.913 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.913 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.913 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.914 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.914 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.914 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.914 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.915 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.915 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.915 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.915 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.916 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.916 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.916 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.916 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.917 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.917 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.917 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.918 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.918 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.918 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.919 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.919 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.919 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.919 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.919 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.920 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.920 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.920 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.921 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.921 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.921 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.921 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.921 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.922 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.922 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.922 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.922 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.923 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.923 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.923 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.923 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.924 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.924 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.924 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.924 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.924 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.925 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.925 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.925 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.925 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.925 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.926 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.926 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.926 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.926 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.926 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.927 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.927 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.927 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.927 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.928 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.928 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.928 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.928 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.928 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.929 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.929 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.929 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.929 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.930 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.930 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.930 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.930 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.930 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.931 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.931 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.931 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.931 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.932 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.932 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.932 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.932 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.933 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.933 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.933 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.933 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.933 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.933 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.934 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.935 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.936 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.936 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.936 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.936 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.936 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.936 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.937 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.938 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.938 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.938 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.938 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.938 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.938 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.939 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.940 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.941 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.941 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.941 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.941 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.941 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.941 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.942 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.942 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.942 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.942 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.943 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.944 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.944 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.944 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.944 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.944 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.944 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.945 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.946 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.946 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.946 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.946 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.946 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.947 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.948 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.949 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.949 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.949 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.949 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.949 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.949 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.950 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.951 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.952 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.952 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.952 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.952 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.952 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.953 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.953 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.953 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.953 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.953 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.953 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.954 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.954 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.954 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.954 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.954 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.954 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.955 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.956 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.957 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.957 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.957 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.957 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.957 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.958 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.958 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.958 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.958 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.958 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.958 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.959 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.960 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.961 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.962 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.962 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.962 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.962 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.962 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.962 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.963 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.964 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.965 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.966 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.967 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.967 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.967 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.967 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.967 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.967 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.968 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.969 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.970 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.971 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.971 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.971 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.971 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.971 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.971 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.972 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.972 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.972 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.972 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.972 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.972 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.973 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.974 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.975 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.976 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.977 183134 WARNING oslo_config.cfg [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 30 04:16:53 np0005601977 nova_compute[183130]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 30 04:16:53 np0005601977 nova_compute[183130]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 30 04:16:53 np0005601977 nova_compute[183130]: and ``live_migration_inbound_addr`` respectively.
Jan 30 04:16:53 np0005601977 nova_compute[183130]: ).  Its value may be silently ignored in the future.#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.977 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.977 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.977 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.977 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.977 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.978 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.978 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.978 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.978 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.978 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.978 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.979 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.979 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.979 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.979 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.979 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.979 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.980 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.981 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.982 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.982 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.982 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.982 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.982 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.982 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.983 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.984 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.984 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.984 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.984 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.984 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.984 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.985 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.986 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.987 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.987 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.987 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.987 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.987 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.987 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.988 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.989 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.990 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.991 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.992 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.993 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.994 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.994 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.994 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.994 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.994 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.994 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.995 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.995 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.995 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.995 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.995 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.995 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.996 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.996 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.996 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.996 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.996 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.996 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.997 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.998 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.998 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.998 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.998 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.998 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.998 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:53 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:53.999 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.000 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.000 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.000 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.000 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.000 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.001 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.001 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.001 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.001 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.001 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.001 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.002 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.002 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.002 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.002 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.002 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.002 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.003 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.003 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.003 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.003 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.003 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.003 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.004 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.004 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.004 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.004 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.004 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.005 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.006 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.007 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.007 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.007 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.007 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.007 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.007 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.008 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.009 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.010 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.011 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.012 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.013 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.014 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.014 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.014 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.014 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.014 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.014 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.015 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.015 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.015 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.015 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.015 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.015 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.016 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.017 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.017 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.017 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.017 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.017 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.017 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.018 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.019 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.020 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.020 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.020 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.020 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.020 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.020 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.021 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.021 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.021 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.021 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.021 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.022 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.023 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.024 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.025 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.026 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.027 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.028 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.028 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.028 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.028 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.028 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.029 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.030 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.031 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.032 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.033 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.034 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.035 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.036 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.037 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.037 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.037 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.037 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.037 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.037 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.038 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.038 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.038 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.038 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.038 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.038 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.039 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.040 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.040 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.040 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.040 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.040 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.040 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.041 183134 DEBUG oslo_service.service [None req-ffaf3eb0-fe22-4cbd-b320-3cbd2ee25b28 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.042 183134 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.064 183134 INFO nova.virt.node [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Determined node identity eb11f67d-14b4-46ee-89fd-92936c45ed58 from /var/lib/nova/compute_id#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.065 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.067 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.067 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.067 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.083 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fbf03318550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.086 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fbf03318550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.087 183134 INFO nova.virt.libvirt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.092 183134 INFO nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Libvirt host capabilities <capabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <host>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <uuid>84994b48-1455-435f-a6fe-1797df140bfa</uuid>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <arch>x86_64</arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model>EPYC-Rome-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <vendor>AMD</vendor>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <microcode version='16777317'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <signature family='23' model='49' stepping='0'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='x2apic'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='tsc-deadline'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='osxsave'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='hypervisor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='tsc_adjust'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='spec-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='stibp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='arch-capabilities'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='cmp_legacy'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='topoext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='virt-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='lbrv'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='tsc-scale'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='vmcb-clean'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='pause-filter'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='pfthreshold'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='svme-addr-chk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='rdctl-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='skip-l1dfl-vmentry'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='mds-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature name='pschange-mc-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <pages unit='KiB' size='4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <pages unit='KiB' size='2048'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <pages unit='KiB' size='1048576'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <power_management>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <suspend_mem/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <suspend_disk/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <suspend_hybrid/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </power_management>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <iommu support='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <migration_features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <live/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <uri_transports>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <uri_transport>tcp</uri_transport>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <uri_transport>rdma</uri_transport>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </uri_transports>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </migration_features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <topology>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <cells num='1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <cell id='0'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          <memory unit='KiB'>7864292</memory>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          <pages unit='KiB' size='4'>1966073</pages>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          <pages unit='KiB' size='2048'>0</pages>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          <distances>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <sibling id='0' value='10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          </distances>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          <cpus num='8'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:          </cpus>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        </cell>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </cells>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </topology>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <cache>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </cache>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <secmodel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model>selinux</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <doi>0</doi>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </secmodel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <secmodel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model>dac</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <doi>0</doi>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </secmodel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </host>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <guest>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <os_type>hvm</os_type>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <arch name='i686'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <wordsize>32</wordsize>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <domain type='qemu'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <domain type='kvm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <pae/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <nonpae/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <acpi default='on' toggle='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <apic default='on' toggle='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <cpuselection/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <deviceboot/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <disksnapshot default='on' toggle='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <externalSnapshot/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </guest>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <guest>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <os_type>hvm</os_type>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <arch name='x86_64'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <wordsize>64</wordsize>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <domain type='qemu'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <domain type='kvm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <acpi default='on' toggle='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <apic default='on' toggle='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <cpuselection/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <deviceboot/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <disksnapshot default='on' toggle='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <externalSnapshot/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </guest>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 
Jan 30 04:16:54 np0005601977 nova_compute[183130]: </capabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: #033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.097 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.101 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 30 04:16:54 np0005601977 nova_compute[183130]: <domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <domain>kvm</domain>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <arch>i686</arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <vcpu max='4096'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <iothreads supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <os supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='firmware'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <loader supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>rom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pflash</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='readonly'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>yes</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='secure'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </loader>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='maximumMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <vendor>AMD</vendor>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='succor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='custom' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbpb'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbpb'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-128'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-256'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-128'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-256'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='KnightsMill'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512er'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512pf'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512er'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512pf'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tbm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tbm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='athlon'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='athlon-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='core2duo'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='core2duo-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='coreduo'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='coreduo-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='n270'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='n270-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='phenom'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='phenom-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <memoryBacking supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='sourceType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>file</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>anonymous</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>memfd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </memoryBacking>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <disk supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='diskDevice'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>disk</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>cdrom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>floppy</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>lun</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='bus'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>fdc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>scsi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>sata</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-non-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <graphics supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vnc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>egl-headless</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dbus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <video supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='modelType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vga</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>cirrus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>none</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>bochs</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ramfb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <hostdev supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='mode'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>subsystem</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='startupPolicy'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>default</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>mandatory</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>requisite</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>optional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='subsysType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pci</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>scsi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='capsType'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='pciBackend'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </hostdev>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <rng supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-non-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>random</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>egd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>builtin</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <filesystem supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='driverType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>path</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>handle</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtiofs</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </filesystem>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <tpm supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tpm-tis</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tpm-crb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>emulator</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>external</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendVersion'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>2.0</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </tpm>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <redirdev supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='bus'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </redirdev>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <channel supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pty</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>unix</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </channel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <crypto supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>qemu</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>builtin</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </crypto>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <interface supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>default</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>passt</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <panic supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>isa</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>hyperv</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </panic>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <console supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>null</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pty</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dev</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>file</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pipe</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>stdio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>udp</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tcp</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>unix</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>qemu-vdagent</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dbus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <gic supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <genid supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <backup supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <async-teardown supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <s390-pv supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <ps2 supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <tdx supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <sev supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <sgx supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <hyperv supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='features'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>relaxed</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vapic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>spinlocks</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vpindex</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>runtime</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>synic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>stimer</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>reset</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vendor_id</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>frequencies</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>reenlightenment</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tlbflush</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ipi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>avic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>emsr_bitmap</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>xmm_input</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <defaults>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </defaults>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </hyperv>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <launchSecurity supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: </domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.110 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 30 04:16:54 np0005601977 nova_compute[183130]: <domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <domain>kvm</domain>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <arch>i686</arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <vcpu max='240'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <iothreads supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <os supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='firmware'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <loader supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>rom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pflash</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='readonly'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>yes</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='secure'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </loader>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='maximumMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <vendor>AMD</vendor>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='succor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='custom' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbpb'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbpb'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-128'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-256'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-128'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-256'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='KnightsMill'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512er'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512pf'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512er'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512pf'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tbm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tbm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='athlon'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='athlon-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='core2duo'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='core2duo-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='coreduo'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='coreduo-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='n270'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='n270-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='phenom'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='phenom-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <memoryBacking supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='sourceType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>file</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>anonymous</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>memfd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </memoryBacking>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <disk supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='diskDevice'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>disk</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>cdrom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>floppy</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>lun</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='bus'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ide</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>fdc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>scsi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>sata</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-non-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <graphics supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vnc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>egl-headless</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dbus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <video supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='modelType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vga</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>cirrus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>none</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>bochs</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ramfb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <hostdev supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='mode'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>subsystem</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='startupPolicy'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>default</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>mandatory</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>requisite</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>optional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='subsysType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pci</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>scsi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='capsType'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='pciBackend'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </hostdev>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <rng supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-non-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>random</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>egd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>builtin</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <filesystem supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='driverType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>path</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>handle</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtiofs</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </filesystem>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <tpm supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tpm-tis</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tpm-crb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>emulator</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>external</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendVersion'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>2.0</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </tpm>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <redirdev supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='bus'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </redirdev>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <channel supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pty</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>unix</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </channel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <crypto supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>qemu</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>builtin</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </crypto>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <interface supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>default</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>passt</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <panic supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>isa</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>hyperv</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </panic>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <console supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>null</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pty</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dev</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>file</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pipe</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>stdio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>udp</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tcp</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>unix</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>qemu-vdagent</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dbus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <gic supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <genid supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <backup supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <async-teardown supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <s390-pv supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <ps2 supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <tdx supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <sev supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <sgx supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <hyperv supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='features'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>relaxed</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vapic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>spinlocks</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vpindex</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>runtime</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>synic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>stimer</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>reset</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vendor_id</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>frequencies</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>reenlightenment</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tlbflush</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ipi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>avic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>emsr_bitmap</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>xmm_input</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <defaults>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </defaults>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </hyperv>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <launchSecurity supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: </domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.147 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.150 183134 DEBUG nova.virt.libvirt.volume.mount [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.154 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 30 04:16:54 np0005601977 nova_compute[183130]: <domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <domain>kvm</domain>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <arch>x86_64</arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <vcpu max='4096'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <iothreads supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <os supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='firmware'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>efi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <loader supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>rom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pflash</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='readonly'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>yes</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='secure'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>yes</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </loader>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='maximumMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <vendor>AMD</vendor>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='succor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='custom' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbpb'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibpb-brtype'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbpb'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-128'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-256'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-128'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-256'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx10-512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Haswell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='IvyBridge-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='KnightsMill'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512er'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512pf'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='KnightsMill-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4fmaps'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-4vnniw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512er'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512pf'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tbm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fma4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tbm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xop'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amx-tile'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-fp16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrc'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fzrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='tsx-ldtrk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='SierraForest-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='core-capability'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='split-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Snowridge-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='athlon'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='athlon-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='core2duo'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='core2duo-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='coreduo'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='coreduo-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='n270'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='n270-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='phenom'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='phenom-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnow'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='3dnowext'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <memoryBacking supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='sourceType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>file</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>anonymous</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>memfd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </memoryBacking>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <disk supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='diskDevice'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>disk</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>cdrom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>floppy</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>lun</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='bus'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>fdc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>scsi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>sata</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-non-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <graphics supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vnc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>egl-headless</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dbus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <video supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='modelType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vga</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>cirrus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>none</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>bochs</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ramfb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <hostdev supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='mode'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>subsystem</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='startupPolicy'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>default</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>mandatory</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>requisite</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>optional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='subsysType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pci</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>scsi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='capsType'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='pciBackend'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </hostdev>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <rng supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtio-non-transitional</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>random</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>egd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>builtin</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <filesystem supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='driverType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>path</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>handle</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>virtiofs</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </filesystem>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <tpm supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tpm-tis</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tpm-crb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>emulator</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>external</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendVersion'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>2.0</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </tpm>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <redirdev supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='bus'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>usb</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </redirdev>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <channel supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pty</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>unix</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </channel>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <crypto supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>qemu</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendModel'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>builtin</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </crypto>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <interface supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='backendType'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>default</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>passt</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <panic supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='model'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>isa</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>hyperv</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </panic>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <console supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>null</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vc</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pty</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dev</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>file</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pipe</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>stdio</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>udp</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tcp</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>unix</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>qemu-vdagent</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>dbus</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <gic supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <vmcoreinfo supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <genid supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <backingStoreInput supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <backup supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <async-teardown supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <s390-pv supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <ps2 supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <tdx supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <sev supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <sgx supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <hyperv supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='features'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>relaxed</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vapic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>spinlocks</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vpindex</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>runtime</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>synic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>stimer</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>reset</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>vendor_id</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>frequencies</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>reenlightenment</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>tlbflush</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>ipi</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>avic</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>emsr_bitmap</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>xmm_input</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <defaults>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <spinlocks>4095</spinlocks>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <stimer_direct>on</stimer_direct>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </defaults>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </hyperv>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <launchSecurity supported='no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: </domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:16:54 np0005601977 nova_compute[183130]: 2026-01-30 09:16:54.209 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 30 04:16:54 np0005601977 nova_compute[183130]: <domainCapabilities>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <domain>kvm</domain>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <arch>x86_64</arch>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <vcpu max='240'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <iothreads supported='yes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <os supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <enum name='firmware'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <loader supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='type'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>rom</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>pflash</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='readonly'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>yes</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='secure'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>no</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </loader>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:  <cpu>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='maximum' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <enum name='maximumMigratable'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>on</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <value>off</value>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </enum>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='host-model' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <vendor>AMD</vendor>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='x2apic'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='stibp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='succor'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lbrv'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    </mode>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:    <mode name='custom' supported='yes'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Broadwell-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-ne-convert'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx-vnni-int8'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bhi-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='bus-lock-detect'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cldemote'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='cmpccxadd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ddpd-u'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fbsdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='intel-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ipred-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='lam'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mcdt-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdir64b'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='movdiri'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pbrsb-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='prefetchiti'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='psdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rrsba-ctrl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='serialize'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sha512'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm3'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='sm4'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ss'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Cooperlake-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='hle'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='ibrs-all'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='rtm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='taa-no'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='mpx'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Denverton-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='Dhyana-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='auto-ibrs'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-bf16'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bitalg'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512bw'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512cd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512dq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512f'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512ifma'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vbmi2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vl'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='avx512vnni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='gfni'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='la57'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='perfmon-v2'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='amd-psfd'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='erms'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='fsrm'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='invpcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='no-nested-data-bp'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='null-sel-clr-base'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pcid'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='pku'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='stibp-always-on'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vaes'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='vpclmulqdq'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:        <feature name='xsaves'/>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      </blockers>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:16:54 np0005601977 nova_compute[183130]:      <blockers model='EPYC-Turin'>
Jan 30 04:18:12 np0005601977 rsyslogd[1006]: imjournal: 2272 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 30 04:18:12 np0005601977 python3.9[195790]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:18:12 np0005601977 systemd[1]: Reloading.
Jan 30 04:18:12 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:12 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:12 np0005601977 systemd[1]: Starting node_exporter container...
Jan 30 04:18:12 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:18:12 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68bc638d3a1f4944c5a97ad1071ae0ebb6de59f071850f1f8e0defaca329a507/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:12 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68bc638d3a1f4944c5a97ad1071ae0ebb6de59f071850f1f8e0defaca329a507/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:12 np0005601977 systemd[1]: Started /usr/bin/podman healthcheck run 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13.
Jan 30 04:18:12 np0005601977 podman[195831]: 2026-01-30 09:18:12.758614351 +0000 UTC m=+0.104421451 container init 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.769Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.769Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.769Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.770Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.770Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.770Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.770Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.770Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=arp
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=bcache
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=bonding
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=cpu
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=edac
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=filefd
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=netclass
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=netdev
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=netstat
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=nfs
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=nvme
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=softnet
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=systemd
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=xfs
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=node_exporter.go:117 level=info collector=zfs
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.771Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 30 04:18:12 np0005601977 node_exporter[195846]: ts=2026-01-30T09:18:12.772Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 30 04:18:12 np0005601977 podman[195831]: 2026-01-30 09:18:12.786439883 +0000 UTC m=+0.132246973 container start 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:18:12 np0005601977 podman[195831]: node_exporter
Jan 30 04:18:12 np0005601977 systemd[1]: Started node_exporter container.
Jan 30 04:18:12 np0005601977 podman[195855]: 2026-01-30 09:18:12.848536659 +0000 UTC m=+0.048956423 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:18:14 np0005601977 python3.9[196028]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:18:15 np0005601977 python3.9[196180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:16 np0005601977 python3.9[196305]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764695.4897962-2160-115385826326173/.source.yaml _original_basename=.55bnencl follow=False checksum=42128c20150d024023dad565fc076bdb6d93d087 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:17 np0005601977 python3.9[196457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:17 np0005601977 python3.9[196580]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764696.7450695-2205-41174482927930/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:19 np0005601977 python3.9[196733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:20 np0005601977 python3.9[196885]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:20 np0005601977 python3.9[197037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:21 np0005601977 python3.9[197115]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.x3hxc7j5 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:21 np0005601977 python3.9[197265]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:24 np0005601977 python3.9[197688]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 30 04:18:24 np0005601977 podman[197742]: 2026-01-30 09:18:24.830850322 +0000 UTC m=+0.051337351 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:18:24 np0005601977 systemd[1]: b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91-632b452c850bf3ff.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:18:24 np0005601977 systemd[1]: b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91-632b452c850bf3ff.service: Failed with result 'exit-code'.
Jan 30 04:18:25 np0005601977 python3.9[197859]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:18:26 np0005601977 python3[198011]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:18:27 np0005601977 podman[198025]: 2026-01-30 09:18:27.831479596 +0000 UTC m=+1.670200450 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 30 04:18:27 np0005601977 podman[198121]: 2026-01-30 09:18:27.943123042 +0000 UTC m=+0.039831814 container create c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:18:27 np0005601977 podman[198121]: 2026-01-30 09:18:27.924340597 +0000 UTC m=+0.021049389 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 30 04:18:27 np0005601977 python3[198011]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 30 04:18:28 np0005601977 python3.9[198311]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:29 np0005601977 python3.9[198465]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:29 np0005601977 python3.9[198541]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:30 np0005601977 python3.9[198692]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764709.9076622-2541-20900475141895/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:31 np0005601977 python3.9[198768]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:18:31 np0005601977 systemd[1]: Reloading.
Jan 30 04:18:31 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:31 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:31 np0005601977 python3.9[198878]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:18:31 np0005601977 systemd[1]: Reloading.
Jan 30 04:18:32 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:32 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:32 np0005601977 systemd[1]: Starting podman_exporter container...
Jan 30 04:18:32 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:18:32 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdff98b021afa77121ab6632d7e83d189c4290048002656a9de95471990430aa/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:32 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdff98b021afa77121ab6632d7e83d189c4290048002656a9de95471990430aa/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:32 np0005601977 systemd[1]: Started /usr/bin/podman healthcheck run c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32.
Jan 30 04:18:32 np0005601977 podman[198917]: 2026-01-30 09:18:32.403249432 +0000 UTC m=+0.123340839 container init c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.415Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.415Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.415Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.415Z caller=handler.go:105 level=info collector=container
Jan 30 04:18:32 np0005601977 podman[198917]: 2026-01-30 09:18:32.428531372 +0000 UTC m=+0.148622809 container start c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:18:32 np0005601977 podman[198917]: podman_exporter
Jan 30 04:18:32 np0005601977 systemd[1]: Starting Podman API Service...
Jan 30 04:18:32 np0005601977 systemd[1]: Started Podman API Service.
Jan 30 04:18:32 np0005601977 systemd[1]: Started podman_exporter container.
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="Setting parallel job count to 25"
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="Using sqlite as database backend"
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 30 04:18:32 np0005601977 podman[198944]: @ - - [30/Jan/2026:09:18:32 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 30 04:18:32 np0005601977 podman[198944]: time="2026-01-30T09:18:32Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 30 04:18:32 np0005601977 podman[198944]: @ - - [30/Jan/2026:09:18:32 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18076 "" "Go-http-client/1.1"
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.492Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.493Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 30 04:18:32 np0005601977 podman_exporter[198933]: ts=2026-01-30T09:18:32.493Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 30 04:18:32 np0005601977 podman[198942]: 2026-01-30 09:18:32.507538269 +0000 UTC m=+0.071802043 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:18:32 np0005601977 systemd[1]: c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32-5a9a6f62fbb56568.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:18:32 np0005601977 systemd[1]: c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32-5a9a6f62fbb56568.service: Failed with result 'exit-code'.
Jan 30 04:18:33 np0005601977 podman[199102]: 2026-01-30 09:18:33.642929585 +0000 UTC m=+0.074997183 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 30 04:18:33 np0005601977 python3.9[199139]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:18:35 np0005601977 python3.9[199301]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:35 np0005601977 python3.9[199426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764714.6746404-2676-71142428005873/.source.yaml _original_basename=.juv98qs7 follow=False checksum=441100862386c9bbe2f594fa4146745dc81605f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:36 np0005601977 python3.9[199578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:37 np0005601977 python3.9[199701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764716.108166-2721-59902279665357/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:37 np0005601977 podman[199702]: 2026-01-30 09:18:37.190037668 +0000 UTC m=+0.141711127 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:18:38 np0005601977 python3.9[199881]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:39 np0005601977 python3.9[200033]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:39 np0005601977 python3.9[200185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:40 np0005601977 python3.9[200263]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.26_83mn5 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:41 np0005601977 python3.9[200413]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:43 np0005601977 podman[200808]: 2026-01-30 09:18:43.414963661 +0000 UTC m=+0.045510931 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:18:43 np0005601977 python3.9[200859]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 30 04:18:44 np0005601977 python3.9[201011]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:18:45 np0005601977 python3[201163]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:18:48 np0005601977 podman[201177]: 2026-01-30 09:18:48.013747348 +0000 UTC m=+2.371144188 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 30 04:18:48 np0005601977 podman[201276]: 2026-01-30 09:18:48.139703645 +0000 UTC m=+0.048325581 container create b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, release=1769056855, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7, distribution-scope=public, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Jan 30 04:18:48 np0005601977 podman[201276]: 2026-01-30 09:18:48.114124355 +0000 UTC m=+0.022746291 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 30 04:18:48 np0005601977 python3[201163]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 30 04:18:49 np0005601977 python3.9[201467]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:50 np0005601977 python3.9[201621]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:50 np0005601977 python3.9[201697]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:51 np0005601977 python3.9[201848]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764730.991821-3057-257161688394004/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:52 np0005601977 python3.9[201924]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:18:52 np0005601977 systemd[1]: Reloading.
Jan 30 04:18:52 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:52 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:53 np0005601977 python3.9[202036]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:18:53 np0005601977 systemd[1]: Reloading.
Jan 30 04:18:53 np0005601977 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:53 np0005601977 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:53 np0005601977 systemd[1]: Starting openstack_network_exporter container...
Jan 30 04:18:53 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:18:53 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35014d26b01ff029570ce21e947bdc1a93383588772f186e72c7c31d2f25d2a5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:53 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35014d26b01ff029570ce21e947bdc1a93383588772f186e72c7c31d2f25d2a5/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:53 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35014d26b01ff029570ce21e947bdc1a93383588772f186e72c7c31d2f25d2a5/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:53 np0005601977 systemd[1]: Started /usr/bin/podman healthcheck run b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941.
Jan 30 04:18:53 np0005601977 podman[202076]: 2026-01-30 09:18:53.561923277 +0000 UTC m=+0.183975504 container init b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, architecture=x86_64, managed_by=edpm_ansible)
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *bridge.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *coverage.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *datapath.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *iface.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *memory.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *ovn.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *pmd_perf.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *pmd_rxq.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: INFO    09:18:53 main.go:48: registering *vswitch.Collector
Jan 30 04:18:53 np0005601977 openstack_network_exporter[202092]: NOTICE  09:18:53 main.go:76: listening on https://:9105/metrics
Jan 30 04:18:53 np0005601977 podman[202076]: 2026-01-30 09:18:53.594724074 +0000 UTC m=+0.216776341 container start b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 30 04:18:53 np0005601977 podman[202076]: openstack_network_exporter
Jan 30 04:18:53 np0005601977 systemd[1]: Started openstack_network_exporter container.
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.621 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.636 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.636 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.637 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.637 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.637 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.637 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.655 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.655 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.656 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.656 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:18:53 np0005601977 podman[202102]: 2026-01-30 09:18:53.67795044 +0000 UTC m=+0.072143011 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, version=9.7, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container)
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.781 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.782 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5782MB free_disk=73.3651237487793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.782 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.783 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.907 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.908 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.945 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.956 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.957 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:18:53 np0005601977 nova_compute[183130]: 2026-01-30 09:18:53.958 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:18:54 np0005601977 nova_compute[183130]: 2026-01-30 09:18:54.663 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:54 np0005601977 nova_compute[183130]: 2026-01-30 09:18:54.664 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:54 np0005601977 python3.9[202275]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:18:55 np0005601977 nova_compute[183130]: 2026-01-30 09:18:55.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:55 np0005601977 nova_compute[183130]: 2026-01-30 09:18:55.341 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:55 np0005601977 nova_compute[183130]: 2026-01-30 09:18:55.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:18:55 np0005601977 nova_compute[183130]: 2026-01-30 09:18:55.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:18:55 np0005601977 nova_compute[183130]: 2026-01-30 09:18:55.357 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:18:55 np0005601977 podman[202300]: 2026-01-30 09:18:55.840000217 +0000 UTC m=+0.053342704 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:18:55 np0005601977 systemd[1]: b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91-632b452c850bf3ff.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:18:55 np0005601977 systemd[1]: b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91-632b452c850bf3ff.service: Failed with result 'exit-code'.
Jan 30 04:18:56 np0005601977 python3.9[202444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:56 np0005601977 auditd[701]: Audit daemon rotating log files
Jan 30 04:18:56 np0005601977 python3.9[202569]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764735.803029-3192-23785320271920/.source.yaml _original_basename=.ysujyfg9 follow=False checksum=c5fad1fa35e900faf5323c50ca144cf21063d1e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:18:57.371 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:18:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:18:57.372 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:18:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:18:57.372 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:18:57 np0005601977 python3.9[202721]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:18:58 np0005601977 python3.9[202873]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 30 04:18:59 np0005601977 python3.9[203038]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:18:59 np0005601977 systemd[1]: Started libpod-conmon-92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851.scope.
Jan 30 04:18:59 np0005601977 podman[203039]: 2026-01-30 09:18:59.655108568 +0000 UTC m=+0.096165527 container exec 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true)
Jan 30 04:18:59 np0005601977 podman[203039]: 2026-01-30 09:18:59.687577565 +0000 UTC m=+0.128634564 container exec_died 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:18:59 np0005601977 systemd[1]: libpod-conmon-92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851.scope: Deactivated successfully.
Jan 30 04:19:00 np0005601977 python3.9[203222]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:00 np0005601977 systemd[1]: Started libpod-conmon-92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851.scope.
Jan 30 04:19:00 np0005601977 podman[203223]: 2026-01-30 09:19:00.377942809 +0000 UTC m=+0.064522344 container exec 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 30 04:19:00 np0005601977 podman[203223]: 2026-01-30 09:19:00.417417276 +0000 UTC m=+0.103996791 container exec_died 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:19:00 np0005601977 systemd[1]: libpod-conmon-92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851.scope: Deactivated successfully.
Jan 30 04:19:01 np0005601977 python3.9[203409]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:01 np0005601977 python3.9[203561]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 30 04:19:02 np0005601977 python3.9[203726]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:02 np0005601977 systemd[1]: Started libpod-conmon-9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d.scope.
Jan 30 04:19:02 np0005601977 podman[203727]: 2026-01-30 09:19:02.499931871 +0000 UTC m=+0.074068786 container exec 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:19:02 np0005601977 podman[203746]: 2026-01-30 09:19:02.581367677 +0000 UTC m=+0.072750159 container exec_died 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:19:02 np0005601977 podman[203727]: 2026-01-30 09:19:02.586795462 +0000 UTC m=+0.160932387 container exec_died 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:19:02 np0005601977 systemd[1]: libpod-conmon-9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d.scope: Deactivated successfully.
Jan 30 04:19:02 np0005601977 podman[203757]: 2026-01-30 09:19:02.660790285 +0000 UTC m=+0.048481215 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:19:03 np0005601977 python3.9[203934]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:03 np0005601977 systemd[1]: Started libpod-conmon-9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d.scope.
Jan 30 04:19:03 np0005601977 podman[203935]: 2026-01-30 09:19:03.264741111 +0000 UTC m=+0.076117035 container exec 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:19:03 np0005601977 podman[203935]: 2026-01-30 09:19:03.27451273 +0000 UTC m=+0.085888574 container exec_died 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:19:03 np0005601977 systemd[1]: libpod-conmon-9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d.scope: Deactivated successfully.
Jan 30 04:19:03 np0005601977 podman[204092]: 2026-01-30 09:19:03.732797206 +0000 UTC m=+0.054466666 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:19:03 np0005601977 python3.9[204130]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:04 np0005601977 python3.9[204290]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 30 04:19:05 np0005601977 python3.9[204453]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:05 np0005601977 systemd[1]: Started libpod-conmon-b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91.scope.
Jan 30 04:19:05 np0005601977 podman[204454]: 2026-01-30 09:19:05.271944357 +0000 UTC m=+0.074348004 container exec b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:19:05 np0005601977 podman[204454]: 2026-01-30 09:19:05.302183131 +0000 UTC m=+0.104586748 container exec_died b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:19:05 np0005601977 systemd[1]: libpod-conmon-b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91.scope: Deactivated successfully.
Jan 30 04:19:05 np0005601977 python3.9[204635]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:06 np0005601977 systemd[1]: Started libpod-conmon-b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91.scope.
Jan 30 04:19:06 np0005601977 podman[204636]: 2026-01-30 09:19:06.036351684 +0000 UTC m=+0.065117261 container exec b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 30 04:19:06 np0005601977 podman[204636]: 2026-01-30 09:19:06.070486519 +0000 UTC m=+0.099252166 container exec_died b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:19:06 np0005601977 systemd[1]: libpod-conmon-b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91.scope: Deactivated successfully.
Jan 30 04:19:06 np0005601977 python3.9[204818]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:07 np0005601977 python3.9[204970]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 30 04:19:07 np0005601977 podman[205105]: 2026-01-30 09:19:07.856458568 +0000 UTC m=+0.125612958 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:19:07 np0005601977 python3.9[205146]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:08 np0005601977 systemd[1]: Started libpod-conmon-2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13.scope.
Jan 30 04:19:08 np0005601977 podman[205160]: 2026-01-30 09:19:08.083235323 +0000 UTC m=+0.090354221 container exec 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:19:08 np0005601977 podman[205160]: 2026-01-30 09:19:08.116520044 +0000 UTC m=+0.123638912 container exec_died 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:19:08 np0005601977 systemd[1]: libpod-conmon-2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13.scope: Deactivated successfully.
Jan 30 04:19:08 np0005601977 python3.9[205342]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:08 np0005601977 systemd[1]: Started libpod-conmon-2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13.scope.
Jan 30 04:19:08 np0005601977 podman[205343]: 2026-01-30 09:19:08.847959151 +0000 UTC m=+0.093827601 container exec 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:19:08 np0005601977 podman[205343]: 2026-01-30 09:19:08.88084253 +0000 UTC m=+0.126711000 container exec_died 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:19:08 np0005601977 systemd[1]: libpod-conmon-2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13.scope: Deactivated successfully.
Jan 30 04:19:09 np0005601977 python3.9[205525]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:10 np0005601977 python3.9[205677]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 30 04:19:10 np0005601977 python3.9[205843]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:10 np0005601977 systemd[1]: Started libpod-conmon-c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32.scope.
Jan 30 04:19:10 np0005601977 podman[205844]: 2026-01-30 09:19:10.872111349 +0000 UTC m=+0.088382635 container exec c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:19:11 np0005601977 podman[205864]: 2026-01-30 09:19:11.006672651 +0000 UTC m=+0.127445810 container exec_died c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:19:11 np0005601977 podman[205844]: 2026-01-30 09:19:11.018035596 +0000 UTC m=+0.234306902 container exec_died c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:19:11 np0005601977 systemd[1]: libpod-conmon-c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32.scope: Deactivated successfully.
Jan 30 04:19:11 np0005601977 python3.9[206028]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:11 np0005601977 systemd[1]: Started libpod-conmon-c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32.scope.
Jan 30 04:19:11 np0005601977 podman[206029]: 2026-01-30 09:19:11.701861683 +0000 UTC m=+0.060280053 container exec c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:19:11 np0005601977 podman[206048]: 2026-01-30 09:19:11.760411855 +0000 UTC m=+0.050513424 container exec_died c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:19:11 np0005601977 podman[206029]: 2026-01-30 09:19:11.765086108 +0000 UTC m=+0.123504448 container exec_died c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:19:11 np0005601977 systemd[1]: libpod-conmon-c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32.scope: Deactivated successfully.
Jan 30 04:19:12 np0005601977 python3.9[206213]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:12 np0005601977 python3.9[206365]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 30 04:19:13 np0005601977 python3.9[206527]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:13 np0005601977 systemd[1]: Started libpod-conmon-b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941.scope.
Jan 30 04:19:13 np0005601977 podman[206528]: 2026-01-30 09:19:13.732847367 +0000 UTC m=+0.091945147 container exec b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, architecture=x86_64, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:19:13 np0005601977 podman[206528]: 2026-01-30 09:19:13.763688828 +0000 UTC m=+0.122786618 container exec_died b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855)
Jan 30 04:19:13 np0005601977 systemd[1]: libpod-conmon-b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941.scope: Deactivated successfully.
Jan 30 04:19:13 np0005601977 podman[206545]: 2026-01-30 09:19:13.816022732 +0000 UTC m=+0.082641471 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:19:14 np0005601977 python3.9[206730]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:14 np0005601977 systemd[1]: Started libpod-conmon-b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941.scope.
Jan 30 04:19:14 np0005601977 podman[206731]: 2026-01-30 09:19:14.450545191 +0000 UTC m=+0.061258050 container exec b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1769056855, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:19:14 np0005601977 podman[206731]: 2026-01-30 09:19:14.481543436 +0000 UTC m=+0.092256305 container exec_died b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, 
io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 30 04:19:14 np0005601977 systemd[1]: libpod-conmon-b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941.scope: Deactivated successfully.
Jan 30 04:19:15 np0005601977 python3.9[206915]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:23 np0005601977 podman[206940]: 2026-01-30 09:19:23.849087227 +0000 UTC m=+0.066611103 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855)
Jan 30 04:19:26 np0005601977 podman[206962]: 2026-01-30 09:19:26.838471339 +0000 UTC m=+0.057000269 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:19:32 np0005601977 podman[206984]: 2026-01-30 09:19:32.837598723 +0000 UTC m=+0.056331629 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:19:33 np0005601977 podman[207009]: 2026-01-30 09:19:33.831125444 +0000 UTC m=+0.051152302 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 30 04:19:38 np0005601977 podman[207029]: 2026-01-30 09:19:38.900722167 +0000 UTC m=+0.118852345 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:19:44 np0005601977 podman[207056]: 2026-01-30 09:19:44.846998293 +0000 UTC m=+0.065727948 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.370 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.370 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.370 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.371 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.536 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.537 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5957MB free_disk=73.39895248413086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.538 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.538 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.598 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.599 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.621 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.634 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.636 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:19:53 np0005601977 nova_compute[183130]: 2026-01-30 09:19:53.636 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:19:54 np0005601977 nova_compute[183130]: 2026-01-30 09:19:54.635 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:54 np0005601977 nova_compute[183130]: 2026-01-30 09:19:54.636 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:54 np0005601977 podman[207080]: 2026-01-30 09:19:54.823768559 +0000 UTC m=+0.046267832 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, distribution-scope=public)
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.355 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.356 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.356 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.356 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.356 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:55 np0005601977 nova_compute[183130]: 2026-01-30 09:19:55.356 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:19:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601977 python3.9[207228]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:56 np0005601977 nova_compute[183130]: 2026-01-30 09:19:56.353 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:56 np0005601977 python3.9[207380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:19:56 np0005601977 python3.9[207503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764796.0733294-3885-96975142779853/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:19:57.372 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:19:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:19:57.372 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:19:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:19:57.373 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:19:57 np0005601977 podman[207627]: 2026-01-30 09:19:57.720510055 +0000 UTC m=+0.038508040 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:19:57 np0005601977 python3.9[207673]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:58 np0005601977 python3.9[207825]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:19:59 np0005601977 python3.9[207903]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:59 np0005601977 python3.9[208055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:00 np0005601977 python3.9[208133]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.168bnuph recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:00 np0005601977 python3.9[208285]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:01 np0005601977 python3.9[208363]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:02 np0005601977 python3.9[208515]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:02 np0005601977 podman[208668]: 2026-01-30 09:20:02.915140348 +0000 UTC m=+0.046533769 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:20:03 np0005601977 python3[208669]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:20:03 np0005601977 python3.9[208844]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:03 np0005601977 podman[208922]: 2026-01-30 09:20:03.952553441 +0000 UTC m=+0.050633727 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:20:04 np0005601977 python3.9[208923]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:04 np0005601977 python3.9[209093]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:05 np0005601977 python3.9[209171]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:05 np0005601977 python3.9[209323]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:06 np0005601977 python3.9[209401]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:07 np0005601977 python3.9[209553]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:07 np0005601977 python3.9[209631]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:08 np0005601977 python3.9[209783]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:08 np0005601977 python3.9[209908]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764807.8582141-4260-59336077643777/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:09 np0005601977 podman[210032]: 2026-01-30 09:20:09.384923223 +0000 UTC m=+0.070808633 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:20:09 np0005601977 python3.9[210075]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:10 np0005601977 python3.9[210238]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:11 np0005601977 python3.9[210393]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:11 np0005601977 python3.9[210545]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:12 np0005601977 python3.9[210699]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:20:13 np0005601977 python3.9[210853]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:14 np0005601977 python3.9[211008]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:14 np0005601977 systemd[1]: session-26.scope: Deactivated successfully.
Jan 30 04:20:14 np0005601977 systemd[1]: session-26.scope: Consumed 1min 33.343s CPU time.
Jan 30 04:20:14 np0005601977 systemd-logind[809]: Session 26 logged out. Waiting for processes to exit.
Jan 30 04:20:14 np0005601977 systemd-logind[809]: Removed session 26.
Jan 30 04:20:15 np0005601977 podman[211033]: 2026-01-30 09:20:15.834226401 +0000 UTC m=+0.056866435 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:20:25 np0005601977 podman[211057]: 2026-01-30 09:20:25.836075533 +0000 UTC m=+0.055309060 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 30 04:20:27 np0005601977 podman[211078]: 2026-01-30 09:20:27.827341184 +0000 UTC m=+0.051128581 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:20:33 np0005601977 podman[211098]: 2026-01-30 09:20:33.833268604 +0000 UTC m=+0.052210242 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:20:34 np0005601977 podman[211123]: 2026-01-30 09:20:34.84555604 +0000 UTC m=+0.060233181 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 30 04:20:39 np0005601977 podman[211143]: 2026-01-30 09:20:39.877913528 +0000 UTC m=+0.096860707 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 30 04:20:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:20:40.106 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:20:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:20:40.108 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:20:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:20:40.109 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:20:46 np0005601977 podman[211169]: 2026-01-30 09:20:46.827699525 +0000 UTC m=+0.047855519 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.338 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.354 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.376 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.377 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.503 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.504 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6020MB free_disk=73.39825820922852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.505 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.505 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.574 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.575 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.614 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.628 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.631 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:20:53 np0005601977 nova_compute[183130]: 2026-01-30 09:20:53.632 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:55 np0005601977 nova_compute[183130]: 2026-01-30 09:20:55.621 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:55 np0005601977 nova_compute[183130]: 2026-01-30 09:20:55.621 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:20:55 np0005601977 nova_compute[183130]: 2026-01-30 09:20:55.622 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:20:55 np0005601977 nova_compute[183130]: 2026-01-30 09:20:55.637 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:20:56 np0005601977 nova_compute[183130]: 2026-01-30 09:20:56.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:56 np0005601977 nova_compute[183130]: 2026-01-30 09:20:56.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:56 np0005601977 nova_compute[183130]: 2026-01-30 09:20:56.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:56 np0005601977 nova_compute[183130]: 2026-01-30 09:20:56.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:56 np0005601977 podman[211194]: 2026-01-30 09:20:56.859629777 +0000 UTC m=+0.069814393 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Jan 30 04:20:57 np0005601977 nova_compute[183130]: 2026-01-30 09:20:57.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:57 np0005601977 nova_compute[183130]: 2026-01-30 09:20:57.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:57 np0005601977 nova_compute[183130]: 2026-01-30 09:20:57.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:57 np0005601977 nova_compute[183130]: 2026-01-30 09:20:57.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:20:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:20:57.372 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:20:57.373 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:20:57.373 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:58 np0005601977 podman[211215]: 2026-01-30 09:20:58.880965914 +0000 UTC m=+0.095114990 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:21:04 np0005601977 podman[211235]: 2026-01-30 09:21:04.845516051 +0000 UTC m=+0.056204406 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:21:05 np0005601977 podman[211259]: 2026-01-30 09:21:05.82800567 +0000 UTC m=+0.049018232 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:21:10 np0005601977 podman[211279]: 2026-01-30 09:21:10.880307953 +0000 UTC m=+0.099552636 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:21:17 np0005601977 podman[211306]: 2026-01-30 09:21:17.818632852 +0000 UTC m=+0.042240320 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:21:27 np0005601977 podman[211331]: 2026-01-30 09:21:27.830086104 +0000 UTC m=+0.053842119 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:21:29 np0005601977 podman[211352]: 2026-01-30 09:21:29.832620546 +0000 UTC m=+0.056188016 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:21:35 np0005601977 podman[211372]: 2026-01-30 09:21:35.842391319 +0000 UTC m=+0.060100827 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:21:35 np0005601977 podman[211396]: 2026-01-30 09:21:35.914435844 +0000 UTC m=+0.045513183 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 30 04:21:41 np0005601977 podman[211415]: 2026-01-30 09:21:41.86769299 +0000 UTC m=+0.079128587 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 30 04:21:48 np0005601977 podman[211441]: 2026-01-30 09:21:48.833159139 +0000 UTC m=+0.052112660 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.367 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.514 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.516 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6073MB free_disk=73.39825820922852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.516 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.516 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.614 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.614 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.637 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.650 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.651 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.652 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.652 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.652 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.664 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.664 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.664 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:21:53 np0005601977 nova_compute[183130]: 2026-01-30 09:21:53.683 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:21:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:56 np0005601977 nova_compute[183130]: 2026-01-30 09:21:56.695 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:56 np0005601977 nova_compute[183130]: 2026-01-30 09:21:56.695 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:21:56 np0005601977 nova_compute[183130]: 2026-01-30 09:21:56.695 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:21:56 np0005601977 nova_compute[183130]: 2026-01-30 09:21:56.709 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:21:56 np0005601977 nova_compute[183130]: 2026-01-30 09:21:56.710 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:57 np0005601977 nova_compute[183130]: 2026-01-30 09:21:57.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:57 np0005601977 nova_compute[183130]: 2026-01-30 09:21:57.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:21:57.374 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:21:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:21:57.374 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:21:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:21:57.375 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:21:58 np0005601977 nova_compute[183130]: 2026-01-30 09:21:58.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:58 np0005601977 nova_compute[183130]: 2026-01-30 09:21:58.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:58 np0005601977 nova_compute[183130]: 2026-01-30 09:21:58.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:58 np0005601977 podman[211466]: 2026-01-30 09:21:58.831804555 +0000 UTC m=+0.052414289 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, release=1769056855, version=9.7)
Jan 30 04:21:59 np0005601977 nova_compute[183130]: 2026-01-30 09:21:59.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:59 np0005601977 nova_compute[183130]: 2026-01-30 09:21:59.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:22:00 np0005601977 podman[211488]: 2026-01-30 09:22:00.832057031 +0000 UTC m=+0.052394780 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:22:06 np0005601977 podman[211509]: 2026-01-30 09:22:06.825499291 +0000 UTC m=+0.045385065 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:22:06 np0005601977 podman[211508]: 2026-01-30 09:22:06.86489573 +0000 UTC m=+0.085837503 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:22:12 np0005601977 podman[211551]: 2026-01-30 09:22:12.868750607 +0000 UTC m=+0.083923361 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller)
Jan 30 04:22:19 np0005601977 podman[211578]: 2026-01-30 09:22:19.8287539 +0000 UTC m=+0.047899265 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:22:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:22.796 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:22:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:22.797 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:22:29 np0005601977 podman[211602]: 2026-01-30 09:22:29.854295171 +0000 UTC m=+0.078065498 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9/ubi-minimal, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Jan 30 04:22:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:30.799 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:31 np0005601977 podman[211624]: 2026-01-30 09:22:31.832044112 +0000 UTC m=+0.052762230 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 30 04:22:37 np0005601977 podman[211645]: 2026-01-30 09:22:37.836984962 +0000 UTC m=+0.051099663 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:22:37 np0005601977 podman[211646]: 2026-01-30 09:22:37.836994642 +0000 UTC m=+0.049314004 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:22:43 np0005601977 podman[211684]: 2026-01-30 09:22:43.943057116 +0000 UTC m=+0.160899308 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 30 04:22:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:44.971 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:b8:09 10.100.0.2 2001:db8::f816:3eff:fe90:b809'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe90:b809/64', 'neutron:device_id': 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-175868ce-3812-409c-871e-725dea7b3f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12e8e746-8e5c-4e29-a519-42408cc3b2d9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3fa91002-4287-41d0-8a0e-e00f676bd48b) old=Port_Binding(mac=['fa:16:3e:90:b8:09 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-175868ce-3812-409c-871e-725dea7b3f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:22:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:44.974 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3fa91002-4287-41d0-8a0e-e00f676bd48b in datapath 175868ce-3812-409c-871e-725dea7b3f30 updated#033[00m
Jan 30 04:22:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:44.979 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 175868ce-3812-409c-871e-725dea7b3f30, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:22:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:44.981 104706 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp65ncbeco/privsep.sock']#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.622 104706 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.623 104706 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp65ncbeco/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.497 211716 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.502 211716 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.505 211716 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.506 211716 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211716#033[00m
Jan 30 04:22:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:45.625 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[32841d59-2432-42f0-91f8-4540ac468a89]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:46.216 211716 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:46.217 211716 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:46.217 211716 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:46.325 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[926146e9-7396-4160-86cf-0444369afe70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:47 np0005601977 nova_compute[183130]: 2026-01-30 09:22:47.837 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:47 np0005601977 nova_compute[183130]: 2026-01-30 09:22:47.837 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:47 np0005601977 nova_compute[183130]: 2026-01-30 09:22:47.861 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.080 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.081 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.090 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.090 183134 INFO nova.compute.claims [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.272 183134 DEBUG nova.scheduler.client.report [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Refreshing inventories for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.333 183134 DEBUG nova.scheduler.client.report [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Updating ProviderTree inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.334 183134 DEBUG nova.compute.provider_tree [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.360 183134 DEBUG nova.scheduler.client.report [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Refreshing aggregate associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.361 183134 DEBUG nova.compute.provider_tree [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Updating resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 generation from 3 to 4 during operation: update_aggregates _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.394 183134 DEBUG nova.scheduler.client.report [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Refreshing trait associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, traits: HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.485 183134 DEBUG nova.compute.provider_tree [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.499 183134 DEBUG nova.scheduler.client.report [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.521 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.521 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.551 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.552 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.574 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.575 183134 DEBUG nova.network.neutron [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.583 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.609 183134 INFO nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.631 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.703 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.704 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.713 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.713 183134 INFO nova.compute.claims [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.787 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.788 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.789 183134 INFO nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Creating image(s)#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.789 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.790 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.790 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.791 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.792 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.878 183134 DEBUG nova.compute.provider_tree [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.896 183134 DEBUG nova.scheduler.client.report [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.919 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.920 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.968 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.969 183134 DEBUG nova.network.neutron [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:22:48 np0005601977 nova_compute[183130]: 2026-01-30 09:22:48.993 183134 INFO nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.010 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.088 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.090 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.090 183134 INFO nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Creating image(s)#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.091 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.091 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.092 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.092 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.177 183134 WARNING oslo_policy.policy [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.178 183134 WARNING oslo_policy.policy [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.181 183134 DEBUG nova.policy [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:22:49 np0005601977 nova_compute[183130]: 2026-01-30 09:22:49.396 183134 DEBUG nova.policy [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.367 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.405 183134 DEBUG nova.network.neutron [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Successfully created port: b77bd158-fad4-4c68-8373-a447c84d330b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.415 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.416 183134 DEBUG nova.virt.images [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] ab7cf61b-98df-4a10-83fd-7d23191f2bba was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.417 183134 DEBUG nova.privsep.utils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.417 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.555 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.557 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.635 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.636 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.648 183134 INFO oslo.privsep.daemon [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpdqbgbmkw/privsep.sock']#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.649 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:50 np0005601977 nova_compute[183130]: 2026-01-30 09:22:50.649 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:50 np0005601977 podman[211742]: 2026-01-30 09:22:50.82997961 +0000 UTC m=+0.049268923 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.296 183134 INFO oslo.privsep.daemon [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.170 211768 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.177 211768 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.182 211768 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.182 211768 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211768#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.299 183134 WARNING oslo_privsep.priv_context [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] privsep daemon already running#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.377 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.390 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.423 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.424 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.424 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.439 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.449 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.450 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.491 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.492 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.514 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.515 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.516 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.527 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.538 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.560 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.561 183134 DEBUG nova.virt.disk.api [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.562 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.580 183134 DEBUG nova.network.neutron [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Successfully created port: cc981c4c-eb1f-420a-9f46-bfb505a4df87 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.597 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.599 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.609 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.610 183134 DEBUG nova.virt.disk.api [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.611 183134 DEBUG nova.objects.instance [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 43ceb724-51a7-4484-b588-85747155f2fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.632 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.633 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Ensure instance console log exists: /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.634 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.634 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.634 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.702 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk 1073741824" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.703 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.704 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.718 183134 DEBUG nova.network.neutron [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Successfully updated port: b77bd158-fad4-4c68-8373-a447c84d330b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.737 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.737 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.738 183134 DEBUG nova.network.neutron [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.761 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.762 183134 DEBUG nova.virt.disk.api [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.762 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.817 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.818 183134 DEBUG nova.virt.disk.api [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.819 183134 DEBUG nova.objects.instance [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.834 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.834 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Ensure instance console log exists: /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.834 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.835 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:51 np0005601977 nova_compute[183130]: 2026-01-30 09:22:51.835 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:52 np0005601977 nova_compute[183130]: 2026-01-30 09:22:52.012 183134 DEBUG nova.network.neutron [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:22:52 np0005601977 nova_compute[183130]: 2026-01-30 09:22:52.764 183134 DEBUG nova.compute.manager [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-changed-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:22:52 np0005601977 nova_compute[183130]: 2026-01-30 09:22:52.765 183134 DEBUG nova.compute.manager [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Refreshing instance network info cache due to event network-changed-b77bd158-fad4-4c68-8373-a447c84d330b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:22:52 np0005601977 nova_compute[183130]: 2026-01-30 09:22:52.765 183134 DEBUG oslo_concurrency.lockutils [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.087 183134 DEBUG nova.network.neutron [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.337 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.338 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Instance network_info: |[{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.342 183134 DEBUG oslo_concurrency.lockutils [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.342 183134 DEBUG nova.network.neutron [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Refreshing network info cache for port b77bd158-fad4-4c68-8373-a447c84d330b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.348 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Start _get_guest_xml network_info=[{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.356 183134 WARNING nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.364 183134 DEBUG nova.virt.libvirt.host [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.365 183134 DEBUG nova.virt.libvirt.host [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.371 183134 DEBUG nova.virt.libvirt.host [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.372 183134 DEBUG nova.virt.libvirt.host [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.374 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.375 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.376 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.377 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.378 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.378 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.379 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.379 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.380 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.381 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.381 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.382 183134 DEBUG nova.virt.hardware [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.388 183134 DEBUG nova.privsep.utils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.390 183134 DEBUG nova.virt.libvirt.vif [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1800673208',display_name='tempest-TestNetworkAdvancedServerOps-server-1800673208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1800673208',id=2,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBWMmOcexm/n60a0wV+ZCzqayVRcO1K6INCRcfXObsrkDu/ozQQv1ArDS1c2ehfKYVYO75n/FnjcB6h7Xs5xpJrmNoh4WV8d5imOfuVdju5GoBbfSHZMPiM0NrBf/MnozQ==',key_name='tempest-TestNetworkAdvancedServerOps-1994586608',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8bukctap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:22:48Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=43ceb724-51a7-4484-b588-85747155f2fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.391 183134 DEBUG nova.network.os_vif_util [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.392 183134 DEBUG nova.network.os_vif_util [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.395 183134 DEBUG nova.objects.instance [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 43ceb724-51a7-4484-b588-85747155f2fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.408 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.420 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <uuid>43ceb724-51a7-4484-b588-85747155f2fe</uuid>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <name>instance-00000002</name>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1800673208</nova:name>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:22:53</nova:creationTime>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        <nova:port uuid="b77bd158-fad4-4c68-8373-a447c84d330b">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <entry name="serial">43ceb724-51a7-4484-b588-85747155f2fe</entry>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <entry name="uuid">43ceb724-51a7-4484-b588-85747155f2fe</entry>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:c4:3d:87"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <target dev="tapb77bd158-fa"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/console.log" append="off"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:22:53 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:22:53 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:22:53 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:22:53 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.421 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Preparing to wait for external event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.421 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.422 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.422 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.423 183134 DEBUG nova.virt.libvirt.vif [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1800673208',display_name='tempest-TestNetworkAdvancedServerOps-server-1800673208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1800673208',id=2,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBWMmOcexm/n60a0wV+ZCzqayVRcO1K6INCRcfXObsrkDu/ozQQv1ArDS1c2ehfKYVYO75n/FnjcB6h7Xs5xpJrmNoh4WV8d5imOfuVdju5GoBbfSHZMPiM0NrBf/MnozQ==',key_name='tempest-TestNetworkAdvancedServerOps-1994586608',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8bukctap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:22:48Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=43ceb724-51a7-4484-b588-85747155f2fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.423 183134 DEBUG nova.network.os_vif_util [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.424 183134 DEBUG nova.network.os_vif_util [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.424 183134 DEBUG os_vif [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.452 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.453 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.453 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.454 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.464 183134 DEBUG ovsdbapp.backend.ovs_idl [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.465 183134 DEBUG ovsdbapp.backend.ovs_idl [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.465 183134 DEBUG ovsdbapp.backend.ovs_idl [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.466 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.467 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [POLLOUT] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.468 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.470 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.472 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.475 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.494 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.495 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.495 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.496 183134 INFO oslo.privsep.daemon [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpq9dv7141/privsep.sock']
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.651 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.652 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5920MB free_disk=73.36364364624023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.652 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.652 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.734 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 43ceb724-51a7-4484-b588-85747155f2fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.734 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.735 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.735 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.859 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.901 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updated inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.901 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.901 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.929 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:22:53 np0005601977 nova_compute[183130]: 2026-01-30 09:22:53.929 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.132 183134 DEBUG nova.network.neutron [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Successfully updated port: cc981c4c-eb1f-420a-9f46-bfb505a4df87 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.136 183134 INFO oslo.privsep.daemon [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Spawned new privsep daemon via rootwrap
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.032 211805 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.036 211805 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.037 211805 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.038 211805 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211805
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.156 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.157 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.157 183134 DEBUG nova.network.neutron [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.451 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.451 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb77bd158-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.453 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb77bd158-fa, col_values=(('external_ids', {'iface-id': 'b77bd158-fad4-4c68-8373-a447c84d330b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:3d:87', 'vm-uuid': '43ceb724-51a7-4484-b588-85747155f2fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.455 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:54 np0005601977 NetworkManager[55565]: <info>  [1769764974.4574] manager: (tapb77bd158-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.459 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.465 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.466 183134 INFO os_vif [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa')
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.493 183134 DEBUG nova.network.neutron [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.525 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.525 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.526 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:c4:3d:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.527 183134 INFO nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Using config drive
Jan 30 04:22:54 np0005601977 nova_compute[183130]: 2026-01-30 09:22:54.615 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.000 183134 DEBUG nova.compute.manager [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-changed-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.000 183134 DEBUG nova.compute.manager [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Refreshing instance network info cache due to event network-changed-cc981c4c-eb1f-420a-9f46-bfb505a4df87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.000 183134 DEBUG oslo_concurrency.lockutils [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.086 183134 DEBUG nova.network.neutron [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updated VIF entry in instance network info cache for port b77bd158-fad4-4c68-8373-a447c84d330b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.087 183134 DEBUG nova.network.neutron [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.105 183134 DEBUG oslo_concurrency.lockutils [req-c0469422-ebe6-433e-b8d6-60ed71f06a5e req-f6238895-66d0-4aea-8261-4f3d258161f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.827 183134 INFO nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Creating config drive at /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.833 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpchgnhx95 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:22:55 np0005601977 nova_compute[183130]: 2026-01-30 09:22:55.957 183134 DEBUG oslo_concurrency.processutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpchgnhx95" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:22:56 np0005601977 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 30 04:22:56 np0005601977 kernel: tapb77bd158-fa: entered promiscuous mode
Jan 30 04:22:56 np0005601977 NetworkManager[55565]: <info>  [1769764976.0138] manager: (tapb77bd158-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Jan 30 04:22:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:56Z|00027|binding|INFO|Claiming lport b77bd158-fad4-4c68-8373-a447c84d330b for this chassis.
Jan 30 04:22:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:56Z|00028|binding|INFO|b77bd158-fad4-4c68-8373-a447c84d330b: Claiming fa:16:3e:c4:3d:87 10.100.0.9
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.015 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.018 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.032 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:3d:87 10.100.0.9'], port_security=['fa:16:3e:c4:3d:87 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '791b8d70-6f70-41a1-adc4-21257a54b33a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15b22b09-1a47-4869-84dd-49a1c18840ef, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=b77bd158-fad4-4c68-8373-a447c84d330b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.033 104706 INFO neutron.agent.ovn.metadata.agent [-] Port b77bd158-fad4-4c68-8373-a447c84d330b in datapath 7f7c04f6-63be-4d15-9767-329e1266cf2c bound to our chassis
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.035 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f7c04f6-63be-4d15-9767-329e1266cf2c
Jan 30 04:22:56 np0005601977 systemd-udevd[211831]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.058 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:56 np0005601977 NetworkManager[55565]: <info>  [1769764976.0640] device (tapb77bd158-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:22:56 np0005601977 NetworkManager[55565]: <info>  [1769764976.0649] device (tapb77bd158-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:22:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:56Z|00029|binding|INFO|Setting lport b77bd158-fad4-4c68-8373-a447c84d330b ovn-installed in OVS
Jan 30 04:22:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:56Z|00030|binding|INFO|Setting lport b77bd158-fad4-4c68-8373-a447c84d330b up in Southbound
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.069 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:22:56 np0005601977 systemd-machined[154431]: New machine qemu-1-instance-00000002.
Jan 30 04:22:56 np0005601977 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.449 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[36086586-ad8a-4b0e-9c10-641c8762cf99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.450 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7f7c04f6-61 in ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.452 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7f7c04f6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.452 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2cae6838-8d36-4e8c-995b-1291c8fbb25f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.453 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4af805d0-a643-4692-a05d-4019fed411f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.478 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[836ed95d-0b50-4117-9371-fa246e45c7b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.506 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6afa68-57e6-4495-888c-dddec8d41eea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:22:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:56.507 104706 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpe5bxg3p8/privsep.sock']
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.865 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.866 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.866 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.894 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.895 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 30 04:22:56 np0005601977 nova_compute[183130]: 2026-01-30 09:22:56.895 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.210 183134 DEBUG nova.network.neutron [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updating instance_info_cache with network_info: [{"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.229 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.230 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Instance network_info: |[{"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.230 183134 DEBUG oslo_concurrency.lockutils [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.230 183134 DEBUG nova.network.neutron [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Refreshing network info cache for port cc981c4c-eb1f-420a-9f46-bfb505a4df87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.233 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Start _get_guest_xml network_info=[{"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.238 183134 WARNING nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.245 183134 DEBUG nova.virt.libvirt.host [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.246 183134 DEBUG nova.virt.libvirt.host [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.253 183134 DEBUG nova.virt.libvirt.host [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.253 183134 DEBUG nova.virt.libvirt.host [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.254 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.254 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.255 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.255 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.255 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.256 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.256 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.256 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.256 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.257 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.257 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.257 183134 DEBUG nova.virt.hardware [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.260 183134 DEBUG nova.virt.libvirt.vif [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=3,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBTozGo/30MBHMhC0qqyUiXOz453HTS7rAR7GWNweWLgqYHZeswuKN04I0U5cVjAR8MaACmZTSTwsv0g1uGvof8aM0e7Q3swPJgiJmxXuknrNVi52vMRGO+/vfh0PdpnEw==',key_name='tempest-TestSecurityGroupsBasicOps-1644744006',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-gwxz4owx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:22:49Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=7f7f740b-9b5a-4141-8bd5-a4c35a68eab6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.260 104706 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.261 104706 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpe5bxg3p8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.261 183134 DEBUG nova.network.os_vif_util [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.262 183134 DEBUG nova.network.os_vif_util [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.142 211854 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.150 211854 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.155 211854 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.155 211854 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211854#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.263 183134 DEBUG nova.objects.instance [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.265 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[46db9325-3e92-44b0-a505-5d2fc0b4dda6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.281 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <uuid>7f7f740b-9b5a-4141-8bd5-a4c35a68eab6</uuid>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <name>instance-00000003</name>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659</nova:name>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:22:57</nova:creationTime>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        <nova:port uuid="cc981c4c-eb1f-420a-9f46-bfb505a4df87">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <entry name="serial">7f7f740b-9b5a-4141-8bd5-a4c35a68eab6</entry>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <entry name="uuid">7f7f740b-9b5a-4141-8bd5-a4c35a68eab6</entry>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.config"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:bf:8e:0a"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <target dev="tapcc981c4c-eb"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/console.log" append="off"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:22:57 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:22:57 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:22:57 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:22:57 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.285 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Preparing to wait for external event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.285 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.285 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.286 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.286 183134 DEBUG nova.virt.libvirt.vif [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=3,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBTozGo/30MBHMhC0qqyUiXOz453HTS7rAR7GWNweWLgqYHZeswuKN04I0U5cVjAR8MaACmZTSTwsv0g1uGvof8aM0e7Q3swPJgiJmxXuknrNVi52vMRGO+/vfh0PdpnEw==',key_name='tempest-TestSecurityGroupsBasicOps-1644744006',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-gwxz4owx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:22:49Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=7f7f740b-9b5a-4141-8bd5-a4c35a68eab6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.287 183134 DEBUG nova.network.os_vif_util [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.287 183134 DEBUG nova.network.os_vif_util [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.288 183134 DEBUG os_vif [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.288 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.288 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.289 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.292 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.292 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc981c4c-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.292 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc981c4c-eb, col_values=(('external_ids', {'iface-id': 'cc981c4c-eb1f-420a-9f46-bfb505a4df87', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:8e:0a', 'vm-uuid': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.294 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:57 np0005601977 NetworkManager[55565]: <info>  [1769764977.2949] manager: (tapcc981c4c-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.296 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.300 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.301 183134 INFO os_vif [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb')#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.364 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.364 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.365 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:bf:8e:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.365 183134 INFO nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Using config drive#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.375 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.376 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.376 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.432 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769764977.431616, 43ceb724-51a7-4484-b588-85747155f2fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.432 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] VM Started (Lifecycle Event)#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.471 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.490 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769764977.4326203, 43ceb724-51a7-4484-b588-85747155f2fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.490 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.506 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.511 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:22:57 np0005601977 nova_compute[183130]: 2026-01-30 09:22:57.537 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.790 211854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.790 211854 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:57.790 211854 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.308 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0583f6-2ed5-4efa-bb91-88edca44c8f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 NetworkManager[55565]: <info>  [1769764978.3240] manager: (tap7f7c04f6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.323 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9e90c2b3-448b-47e9-a833-f467c54c5231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.344 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e61899af-2935-4534-bb17-a75f15ce0e13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.347 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5ade7383-cbc0-42f0-9df5-92a9652a4934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 NetworkManager[55565]: <info>  [1769764978.3632] device (tap7f7c04f6-60): carrier: link connected
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.365 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[546b8b39-a53d-4241-9d6e-bc51278ab8e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.380 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[76bf3948-1608-4662-aa85-170130d7d496]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f7c04f6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:72:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350970, 'reachable_time': 25052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211892, 'error': None, 'target': 'ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.393 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4e41c9-eff5-48b0-9843-aa4643ca6efe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe49:7275'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 350970, 'tstamp': 350970}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211893, 'error': None, 'target': 'ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.403 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9768b7a2-69fc-4e92-8117-3b969fcbb0c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f7c04f6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:72:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350970, 'reachable_time': 25052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211894, 'error': None, 'target': 'ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.418 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f528633e-21a6-4737-9ef3-ff842030bad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.447 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[26d75ee3-c94b-4387-9e76-b00ffce9c6be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.448 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f7c04f6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.449 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.449 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f7c04f6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.451 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 kernel: tap7f7c04f6-60: entered promiscuous mode
Jan 30 04:22:58 np0005601977 NetworkManager[55565]: <info>  [1769764978.4522] manager: (tap7f7c04f6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.454 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.455 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f7c04f6-60, col_values=(('external_ids', {'iface-id': '9598cf1d-de50-4fa0-9888-e47c067fd409'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.456 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:58Z|00031|binding|INFO|Releasing lport 9598cf1d-de50-4fa0-9888-e47c067fd409 from this chassis (sb_readonly=0)
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.461 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.463 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.464 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f7c04f6-63be-4d15-9767-329e1266cf2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f7c04f6-63be-4d15-9767-329e1266cf2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.465 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d65a92c9-f291-4e77-90e2-62b40e81799f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.467 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-7f7c04f6-63be-4d15-9767-329e1266cf2c
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/7f7c04f6-63be-4d15-9767-329e1266cf2c.pid.haproxy
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 7f7c04f6-63be-4d15-9767-329e1266cf2c
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.468 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'env', 'PROCESS_TAG=haproxy-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7f7c04f6-63be-4d15-9767-329e1266cf2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.736 183134 INFO nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Creating config drive at /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.config#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.740 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_cu9065m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:22:58 np0005601977 podman[211928]: 2026-01-30 09:22:58.817504342 +0000 UTC m=+0.062815307 container create f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 30 04:22:58 np0005601977 systemd[1]: Started libpod-conmon-f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03.scope.
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.861 183134 DEBUG oslo_concurrency.processutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_cu9065m" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:22:58 np0005601977 podman[211928]: 2026-01-30 09:22:58.779590324 +0000 UTC m=+0.024901329 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:22:58 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:22:58 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4b6a41df0d2fb996237288f1f591e584084ee5de9a88a81192ef09bbff5fa93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:22:58 np0005601977 podman[211928]: 2026-01-30 09:22:58.908475927 +0000 UTC m=+0.153786962 container init f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:22:58 np0005601977 podman[211928]: 2026-01-30 09:22:58.916248841 +0000 UTC m=+0.161559806 container start f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:22:58 np0005601977 kernel: tapcc981c4c-eb: entered promiscuous mode
Jan 30 04:22:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:58Z|00032|binding|INFO|Claiming lport cc981c4c-eb1f-420a-9f46-bfb505a4df87 for this chassis.
Jan 30 04:22:58 np0005601977 NetworkManager[55565]: <info>  [1769764978.9317] manager: (tapcc981c4c-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Jan 30 04:22:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:58Z|00033|binding|INFO|cc981c4c-eb1f-420a-9f46-bfb505a4df87: Claiming fa:16:3e:bf:8e:0a 10.100.0.8
Jan 30 04:22:58 np0005601977 systemd-udevd[211879]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:22:58 np0005601977 neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c[211947]: [NOTICE]   (211960) : New worker (211966) forked
Jan 30 04:22:58 np0005601977 neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c[211947]: [NOTICE]   (211960) : Loading success.
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.935 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.940 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 NetworkManager[55565]: <info>  [1769764978.9471] device (tapcc981c4c-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:22:58 np0005601977 NetworkManager[55565]: <info>  [1769764978.9482] device (tapcc981c4c-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.950 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:8e:0a 10.100.0.8'], port_security=['fa:16:3e:bf:8e:0a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '698009b6-3fdf-4d28-bd4f-05dad4fe7608 f44c0181-adb8-44dc-b12e-9a42af2bf3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b86d5d91-f387-44cf-8812-d69fa2c0ba06, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=cc981c4c-eb1f-420a-9f46-bfb505a4df87) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:22:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:58Z|00034|binding|INFO|Setting lport cc981c4c-eb1f-420a-9f46-bfb505a4df87 ovn-installed in OVS
Jan 30 04:22:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:58Z|00035|binding|INFO|Setting lport cc981c4c-eb1f-420a-9f46-bfb505a4df87 up in Southbound
Jan 30 04:22:58 np0005601977 nova_compute[183130]: 2026-01-30 09:22:58.960 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:58 np0005601977 systemd-machined[154431]: New machine qemu-2-instance-00000003.
Jan 30 04:22:58 np0005601977 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.986 104706 INFO neutron.agent.ovn.metadata.agent [-] Port cc981c4c-eb1f-420a-9f46-bfb505a4df87 in datapath 25da4a49-e507-4d4f-9263-ce5e8dbdc544 unbound from our chassis#033[00m
Jan 30 04:22:58 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:58.989 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25da4a49-e507-4d4f-9263-ce5e8dbdc544#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.003 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bb610579-8ca5-40ae-beb7-9b45864d0ff5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.005 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap25da4a49-e1 in ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.007 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap25da4a49-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.007 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e68817e5-c664-43fd-9cd5-e17e5a94ac1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.009 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[20cec3b6-7c54-45ef-bc92-7df93c787b7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.029 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd186db-f420-4934-8553-2f8aa569c7d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.045 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2b540c18-4486-4a7f-991a-15943f117f6a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.073 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ef1c7a00-1d54-4045-b0e7-2d9e2f329c50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 NetworkManager[55565]: <info>  [1769764979.0791] manager: (tap25da4a49-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.084 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe908c9-ec4e-4084-9282-e82baaeb3c10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.106 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f2cd1191-0cc7-401a-99e7-4339fe111fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 systemd-udevd[211992]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.109 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[4253c842-5551-4ae3-8e46-20dd87a182a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 NetworkManager[55565]: <info>  [1769764979.1353] device (tap25da4a49-e0): carrier: link connected
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.140 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d2aa88b6-93ab-4c26-a340-93eadfc086de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.155 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7c40e7f2-be4e-4511-8c52-2a7204775317]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25da4a49-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:2f:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 351047, 'reachable_time': 16756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212011, 'error': None, 'target': 'ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.172 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[059980b6-57cb-4797-a620-808e9953a079]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:2f3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 351047, 'tstamp': 351047}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212013, 'error': None, 'target': 'ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.187 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[60ff7b8b-5489-49b8-a4ee-cc23c18ee8b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25da4a49-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:2f:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 351047, 'reachable_time': 16756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212015, 'error': None, 'target': 'ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.218 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccc4fbc-e89e-471a-bfbb-a5411e83e95f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.277 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769764979.2774734, 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.278 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] VM Started (Lifecycle Event)#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.281 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[23c35ca5-76dc-49f0-af6e-e1db421b0919]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.282 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25da4a49-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.282 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.283 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25da4a49-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.284 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:59 np0005601977 kernel: tap25da4a49-e0: entered promiscuous mode
Jan 30 04:22:59 np0005601977 NetworkManager[55565]: <info>  [1769764979.2853] manager: (tap25da4a49-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.286 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.286 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25da4a49-e0, col_values=(('external_ids', {'iface-id': '1e1a9288-6756-4ed5-a91d-8ad95ed4e3ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.287 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:59 np0005601977 ovn_controller[95460]: 2026-01-30T09:22:59Z|00036|binding|INFO|Releasing lport 1e1a9288-6756-4ed5-a91d-8ad95ed4e3ae from this chassis (sb_readonly=0)
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.288 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.289 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/25da4a49-e507-4d4f-9263-ce5e8dbdc544.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/25da4a49-e507-4d4f-9263-ce5e8dbdc544.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.290 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[192e64d9-4868-4e2e-9489-4c4b12d769b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.290 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-25da4a49-e507-4d4f-9263-ce5e8dbdc544
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/25da4a49-e507-4d4f-9263-ce5e8dbdc544.pid.haproxy
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 25da4a49-e507-4d4f-9263-ce5e8dbdc544
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:22:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:22:59.292 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'env', 'PROCESS_TAG=haproxy-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/25da4a49-e507-4d4f-9263-ce5e8dbdc544.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.292 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.309 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.314 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769764979.2785437, 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.315 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.331 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.336 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.360 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.617 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:22:59 np0005601977 podman[212053]: 2026-01-30 09:22:59.641442195 +0000 UTC m=+0.062047616 container create 566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 30 04:22:59 np0005601977 systemd[1]: Started libpod-conmon-566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893.scope.
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.682 183134 DEBUG nova.network.neutron [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updated VIF entry in instance network info cache for port cc981c4c-eb1f-420a-9f46-bfb505a4df87. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.683 183134 DEBUG nova.network.neutron [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updating instance_info_cache with network_info: [{"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:22:59 np0005601977 nova_compute[183130]: 2026-01-30 09:22:59.701 183134 DEBUG oslo_concurrency.lockutils [req-f9a2ae4c-19ef-4014-82f9-3d8b0db7b86d req-e6c26cce-a202-49c5-9802-1619d9c74afd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:22:59 np0005601977 podman[212053]: 2026-01-30 09:22:59.608851814 +0000 UTC m=+0.029457295 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:22:59 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:22:59 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77f74b7f5980594d92733de27819048695e13e091a12d29c346544b8ee891783/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:22:59 np0005601977 podman[212053]: 2026-01-30 09:22:59.729771376 +0000 UTC m=+0.150376837 container init 566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:22:59 np0005601977 podman[212053]: 2026-01-30 09:22:59.734644091 +0000 UTC m=+0.155249532 container start 566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 30 04:22:59 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [NOTICE]   (212072) : New worker (212074) forked
Jan 30 04:22:59 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [NOTICE]   (212072) : Loading success.
Jan 30 04:23:00 np0005601977 nova_compute[183130]: 2026-01-30 09:23:00.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:00 np0005601977 nova_compute[183130]: 2026-01-30 09:23:00.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:00 np0005601977 nova_compute[183130]: 2026-01-30 09:23:00.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:00 np0005601977 nova_compute[183130]: 2026-01-30 09:23:00.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:23:00 np0005601977 podman[212083]: 2026-01-30 09:23:00.879227646 +0000 UTC m=+0.092637292 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, release=1769056855, version=9.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal)
Jan 30 04:23:02 np0005601977 nova_compute[183130]: 2026-01-30 09:23:02.295 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:02 np0005601977 podman[212105]: 2026-01-30 09:23:02.856216987 +0000 UTC m=+0.064211466 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:23:04 np0005601977 nova_compute[183130]: 2026-01-30 09:23:04.619 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.451 183134 DEBUG nova.compute.manager [req-10852b3a-4c50-469f-a137-055a3c5a4070 req-cc063163-ceda-4d52-a51d-71f8e4f7e7e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.452 183134 DEBUG oslo_concurrency.lockutils [req-10852b3a-4c50-469f-a137-055a3c5a4070 req-cc063163-ceda-4d52-a51d-71f8e4f7e7e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.452 183134 DEBUG oslo_concurrency.lockutils [req-10852b3a-4c50-469f-a137-055a3c5a4070 req-cc063163-ceda-4d52-a51d-71f8e4f7e7e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.452 183134 DEBUG oslo_concurrency.lockutils [req-10852b3a-4c50-469f-a137-055a3c5a4070 req-cc063163-ceda-4d52-a51d-71f8e4f7e7e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.452 183134 DEBUG nova.compute.manager [req-10852b3a-4c50-469f-a137-055a3c5a4070 req-cc063163-ceda-4d52-a51d-71f8e4f7e7e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Processing event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.453 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.471 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.473 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769764985.4728355, 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.473 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.480 183134 INFO nova.virt.libvirt.driver [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Instance spawned successfully.#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.480 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.545 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.549 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.573 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.574 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.574 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.575 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.575 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.575 183134 DEBUG nova.virt.libvirt.driver [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.599 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.702 183134 INFO nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Took 16.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.704 183134 DEBUG nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.848 183134 INFO nova.compute.manager [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Took 17.16 seconds to build instance.#033[00m
Jan 30 04:23:05 np0005601977 nova_compute[183130]: 2026-01-30 09:23:05.901 183134 DEBUG oslo_concurrency.lockutils [None req-68cb7baa-7b95-41b3-b763-7b4834aea0df 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:07 np0005601977 nova_compute[183130]: 2026-01-30 09:23:07.299 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:08 np0005601977 nova_compute[183130]: 2026-01-30 09:23:08.652 183134 DEBUG nova.compute.manager [req-a03d6462-dc39-45cf-84e2-11464240f81f req-d11fbc69-7be2-42fa-a065-20c33fedc9da dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:23:08 np0005601977 nova_compute[183130]: 2026-01-30 09:23:08.654 183134 DEBUG oslo_concurrency.lockutils [req-a03d6462-dc39-45cf-84e2-11464240f81f req-d11fbc69-7be2-42fa-a065-20c33fedc9da dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:23:08 np0005601977 nova_compute[183130]: 2026-01-30 09:23:08.655 183134 DEBUG oslo_concurrency.lockutils [req-a03d6462-dc39-45cf-84e2-11464240f81f req-d11fbc69-7be2-42fa-a065-20c33fedc9da dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:23:08 np0005601977 nova_compute[183130]: 2026-01-30 09:23:08.655 183134 DEBUG oslo_concurrency.lockutils [req-a03d6462-dc39-45cf-84e2-11464240f81f req-d11fbc69-7be2-42fa-a065-20c33fedc9da dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:23:08 np0005601977 nova_compute[183130]: 2026-01-30 09:23:08.656 183134 DEBUG nova.compute.manager [req-a03d6462-dc39-45cf-84e2-11464240f81f req-d11fbc69-7be2-42fa-a065-20c33fedc9da dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] No waiting events found dispatching network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:23:08 np0005601977 nova_compute[183130]: 2026-01-30 09:23:08.657 183134 WARNING nova.compute.manager [req-a03d6462-dc39-45cf-84e2-11464240f81f req-d11fbc69-7be2-42fa-a065-20c33fedc9da dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received unexpected event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 for instance with vm_state active and task_state None.
Jan 30 04:23:08 np0005601977 podman[212128]: 2026-01-30 09:23:08.841706897 +0000 UTC m=+0.054543638 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:23:08 np0005601977 podman[212127]: 2026-01-30 09:23:08.865134615 +0000 UTC m=+0.083164470 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:23:09 np0005601977 nova_compute[183130]: 2026-01-30 09:23:09.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:11 np0005601977 nova_compute[183130]: 2026-01-30 09:23:11.698 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.6998] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7003] device (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <warn>  [1769764991.7004] device (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7008] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7011] device (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <warn>  [1769764991.7011] device (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7017] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7021] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7024] device (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 30 04:23:11 np0005601977 NetworkManager[55565]: <info>  [1769764991.7026] device (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 30 04:23:11 np0005601977 nova_compute[183130]: 2026-01-30 09:23:11.768 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:11Z|00037|binding|INFO|Releasing lport 1e1a9288-6756-4ed5-a91d-8ad95ed4e3ae from this chassis (sb_readonly=0)
Jan 30 04:23:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:11Z|00038|binding|INFO|Releasing lport 9598cf1d-de50-4fa0-9888-e47c067fd409 from this chassis (sb_readonly=0)
Jan 30 04:23:11 np0005601977 nova_compute[183130]: 2026-01-30 09:23:11.796 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:12 np0005601977 nova_compute[183130]: 2026-01-30 09:23:12.215 183134 DEBUG nova.compute.manager [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-changed-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:23:12 np0005601977 nova_compute[183130]: 2026-01-30 09:23:12.216 183134 DEBUG nova.compute.manager [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Refreshing instance network info cache due to event network-changed-cc981c4c-eb1f-420a-9f46-bfb505a4df87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:23:12 np0005601977 nova_compute[183130]: 2026-01-30 09:23:12.216 183134 DEBUG oslo_concurrency.lockutils [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:23:12 np0005601977 nova_compute[183130]: 2026-01-30 09:23:12.216 183134 DEBUG oslo_concurrency.lockutils [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:23:12 np0005601977 nova_compute[183130]: 2026-01-30 09:23:12.217 183134 DEBUG nova.network.neutron [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Refreshing network info cache for port cc981c4c-eb1f-420a-9f46-bfb505a4df87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:23:12 np0005601977 nova_compute[183130]: 2026-01-30 09:23:12.301 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:14 np0005601977 nova_compute[183130]: 2026-01-30 09:23:14.066 183134 DEBUG nova.network.neutron [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updated VIF entry in instance network info cache for port cc981c4c-eb1f-420a-9f46-bfb505a4df87. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:23:14 np0005601977 nova_compute[183130]: 2026-01-30 09:23:14.067 183134 DEBUG nova.network.neutron [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updating instance_info_cache with network_info: [{"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:23:14 np0005601977 nova_compute[183130]: 2026-01-30 09:23:14.086 183134 DEBUG oslo_concurrency.lockutils [req-5158e233-c4c5-4f1c-9fd4-6cb58cdd4fba req-e02e4ca0-55e7-4fba-b1c9-03cc452e2633 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:23:14 np0005601977 nova_compute[183130]: 2026-01-30 09:23:14.623 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:14 np0005601977 podman[212170]: 2026-01-30 09:23:14.856489397 +0000 UTC m=+0.069258966 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:23:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:16Z|00039|memory|INFO|peak resident set size grew 54% in last 827.7 seconds, from 16256 kB to 25088 kB
Jan 30 04:23:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:16Z|00040|memory|INFO|idl-cells-OVN_Southbound:11747 idl-cells-Open_vSwitch:870 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:428 lflow-cache-entries-cache-matches:311 lflow-cache-size-KB:1840 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:718 ofctrl_installed_flow_usage-KB:524 ofctrl_sb_flow_ref_usage-KB:271
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.874 183134 DEBUG nova.compute.manager [req-39318e95-b6de-46f6-b148-f43f0eb6e530 req-4979838d-1af1-48ae-b6cf-93d0e650b5e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.874 183134 DEBUG oslo_concurrency.lockutils [req-39318e95-b6de-46f6-b148-f43f0eb6e530 req-4979838d-1af1-48ae-b6cf-93d0e650b5e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.875 183134 DEBUG oslo_concurrency.lockutils [req-39318e95-b6de-46f6-b148-f43f0eb6e530 req-4979838d-1af1-48ae-b6cf-93d0e650b5e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.875 183134 DEBUG oslo_concurrency.lockutils [req-39318e95-b6de-46f6-b148-f43f0eb6e530 req-4979838d-1af1-48ae-b6cf-93d0e650b5e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.875 183134 DEBUG nova.compute.manager [req-39318e95-b6de-46f6-b148-f43f0eb6e530 req-4979838d-1af1-48ae-b6cf-93d0e650b5e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Processing event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.876 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Instance event wait completed in 19 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.879 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769764996.8791304, 43ceb724-51a7-4484-b588-85747155f2fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.879 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] VM Resumed (Lifecycle Event)
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.881 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.883 183134 INFO nova.virt.libvirt.driver [-] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Instance spawned successfully.
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.883 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.922 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.927 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.928 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.929 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.930 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.930 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.931 183134 DEBUG nova.virt.libvirt.driver [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.938 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:23:16 np0005601977 nova_compute[183130]: 2026-01-30 09:23:16.983 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:23:17 np0005601977 nova_compute[183130]: 2026-01-30 09:23:17.013 183134 INFO nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Took 28.23 seconds to spawn the instance on the hypervisor.
Jan 30 04:23:17 np0005601977 nova_compute[183130]: 2026-01-30 09:23:17.014 183134 DEBUG nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:23:17 np0005601977 nova_compute[183130]: 2026-01-30 09:23:17.075 183134 INFO nova.compute.manager [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Took 29.11 seconds to build instance.
Jan 30 04:23:17 np0005601977 nova_compute[183130]: 2026-01-30 09:23:17.092 183134 DEBUG oslo_concurrency.lockutils [None req-11ac647e-88b6-4dfb-af22-8077de8d48dc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 29.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:23:17 np0005601977 nova_compute[183130]: 2026-01-30 09:23:17.305 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:17Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:8e:0a 10.100.0.8
Jan 30 04:23:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:17Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:8e:0a 10.100.0.8
Jan 30 04:23:19 np0005601977 nova_compute[183130]: 2026-01-30 09:23:19.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:23:20 np0005601977 nova_compute[183130]: 2026-01-30 09:23:20.152 183134 DEBUG nova.compute.manager [req-8750f849-20d8-4053-b4c6-daf9c56425c0 req-e523bc7b-7a60-4ffe-bd37-fa736272d655 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:23:20 np0005601977 nova_compute[183130]: 2026-01-30 09:23:20.153 183134 DEBUG oslo_concurrency.lockutils [req-8750f849-20d8-4053-b4c6-daf9c56425c0 req-e523bc7b-7a60-4ffe-bd37-fa736272d655 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:23:20 np0005601977 nova_compute[183130]: 2026-01-30 09:23:20.153 183134 DEBUG oslo_concurrency.lockutils [req-8750f849-20d8-4053-b4c6-daf9c56425c0 req-e523bc7b-7a60-4ffe-bd37-fa736272d655 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:23:20 np0005601977 nova_compute[183130]: 2026-01-30 09:23:20.153 183134 DEBUG oslo_concurrency.lockutils [req-8750f849-20d8-4053-b4c6-daf9c56425c0 req-e523bc7b-7a60-4ffe-bd37-fa736272d655 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:23:20 np0005601977 nova_compute[183130]: 2026-01-30 09:23:20.154 183134 DEBUG nova.compute.manager [req-8750f849-20d8-4053-b4c6-daf9c56425c0 req-e523bc7b-7a60-4ffe-bd37-fa736272d655 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] No waiting events found dispatching network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:23:20 np0005601977 nova_compute[183130]: 2026-01-30 09:23:20.154 183134 WARNING nova.compute.manager [req-8750f849-20d8-4053-b4c6-daf9c56425c0 req-e523bc7b-7a60-4ffe-bd37-fa736272d655 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received unexpected event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b for instance with vm_state active and task_state None.
Jan 30 04:23:21 np0005601977 podman[212213]: 2026-01-30 09:23:21.824949638 +0000 UTC m=+0.046145824 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:23:22 np0005601977 nova_compute[183130]: 2026-01-30 09:23:22.308 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:23:24.211 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:23:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:23:24.211 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.257 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.430 183134 DEBUG nova.compute.manager [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-changed-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.431 183134 DEBUG nova.compute.manager [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Refreshing instance network info cache due to event network-changed-b77bd158-fad4-4c68-8373-a447c84d330b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.431 183134 DEBUG oslo_concurrency.lockutils [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.431 183134 DEBUG oslo_concurrency.lockutils [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.431 183134 DEBUG nova.network.neutron [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Refreshing network info cache for port b77bd158-fad4-4c68-8373-a447c84d330b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:23:24 np0005601977 nova_compute[183130]: 2026-01-30 09:23:24.629 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:27 np0005601977 nova_compute[183130]: 2026-01-30 09:23:27.078 183134 DEBUG nova.network.neutron [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updated VIF entry in instance network info cache for port b77bd158-fad4-4c68-8373-a447c84d330b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:23:27 np0005601977 nova_compute[183130]: 2026-01-30 09:23:27.078 183134 DEBUG nova.network.neutron [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:27 np0005601977 nova_compute[183130]: 2026-01-30 09:23:27.099 183134 DEBUG oslo_concurrency.lockutils [req-5c5f44ca-d2df-4b4d-83d0-6ac4424ba51f req-882a94d9-1112-430d-a066-f5def20a6847 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:27 np0005601977 nova_compute[183130]: 2026-01-30 09:23:27.312 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:29Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c4:3d:87 10.100.0.9
Jan 30 04:23:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:23:29Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c4:3d:87 10.100.0.9
Jan 30 04:23:29 np0005601977 nova_compute[183130]: 2026-01-30 09:23:29.631 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:23:31.213 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:23:31 np0005601977 podman[212259]: 2026-01-30 09:23:31.844874011 +0000 UTC m=+0.064265450 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7)
Jan 30 04:23:32 np0005601977 nova_compute[183130]: 2026-01-30 09:23:32.314 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:33 np0005601977 nova_compute[183130]: 2026-01-30 09:23:33.467 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:33 np0005601977 podman[212281]: 2026-01-30 09:23:33.854157855 +0000 UTC m=+0.059619118 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:23:34 np0005601977 nova_compute[183130]: 2026-01-30 09:23:34.631 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:34 np0005601977 nova_compute[183130]: 2026-01-30 09:23:34.850 183134 INFO nova.compute.manager [None req-ace3bd48-0014-4e89-9a21-756f4c622609 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Get console output#033[00m
Jan 30 04:23:34 np0005601977 nova_compute[183130]: 2026-01-30 09:23:34.969 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:23:37 np0005601977 nova_compute[183130]: 2026-01-30 09:23:37.317 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:39 np0005601977 nova_compute[183130]: 2026-01-30 09:23:39.634 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:39 np0005601977 podman[212301]: 2026-01-30 09:23:39.846944439 +0000 UTC m=+0.065525315 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 30 04:23:39 np0005601977 podman[212302]: 2026-01-30 09:23:39.847893576 +0000 UTC m=+0.054744758 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:23:42 np0005601977 nova_compute[183130]: 2026-01-30 09:23:42.321 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:42 np0005601977 nova_compute[183130]: 2026-01-30 09:23:42.837 183134 INFO nova.compute.manager [None req-fb68865a-9124-449b-86c9-c3fed869e707 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Get console output#033[00m
Jan 30 04:23:42 np0005601977 nova_compute[183130]: 2026-01-30 09:23:42.843 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:23:43 np0005601977 nova_compute[183130]: 2026-01-30 09:23:43.172 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:44 np0005601977 nova_compute[183130]: 2026-01-30 09:23:44.637 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:45 np0005601977 podman[212343]: 2026-01-30 09:23:45.880012819 +0000 UTC m=+0.094382156 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 30 04:23:47 np0005601977 nova_compute[183130]: 2026-01-30 09:23:47.324 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:49 np0005601977 nova_compute[183130]: 2026-01-30 09:23:49.638 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:52 np0005601977 nova_compute[183130]: 2026-01-30 09:23:52.328 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:52 np0005601977 podman[212370]: 2026-01-30 09:23:52.822237431 +0000 UTC m=+0.039938527 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:23:54 np0005601977 nova_compute[183130]: 2026-01-30 09:23:54.640 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.368 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.369 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.369 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.370 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.461 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.529 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.532 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.589 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.595 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.646 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.647 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.693 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:55.763 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b741677902cdc34801a38b7658be191fd284b6abc977e6eabd1767165f54f9cc" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.821 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.822 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5444MB free_disk=73.30557250976562GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.822 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.823 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.869 183134 INFO nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating resource usage from migration 9bb9d894-ae85-443f-a27f-16301b77e220#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.897 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.898 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Migration 9bb9d894-ae85-443f-a27f-16301b77e220 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.898 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.899 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:23:55 np0005601977 nova_compute[183130]: 2026-01-30 09:23:55.988 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:56 np0005601977 nova_compute[183130]: 2026-01-30 09:23:56.002 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:56 np0005601977 nova_compute[183130]: 2026-01-30 09:23:56.021 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:23:56 np0005601977 nova_compute[183130]: 2026-01-30 09:23:56.022 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:56 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:56.288 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 30 Jan 2026 09:23:55 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-824ab5b4-653b-482d-9ad9-2db55499a1d1 x-openstack-request-id: req-824ab5b4-653b-482d-9ad9-2db55499a1d1 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 30 04:23:56 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:56.289 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "43faf4bc-65eb-437f-b3dc-707ebe898840", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840"}]}, {"id": "bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 30 04:23:56 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:56.289 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-824ab5b4-653b-482d-9ad9-2db55499a1d1 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 30 04:23:56 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:56.291 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b741677902cdc34801a38b7658be191fd284b6abc977e6eabd1767165f54f9cc" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 30 04:23:57 np0005601977 nova_compute[183130]: 2026-01-30 09:23:57.330 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.333 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 30 Jan 2026 09:23:56 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-dee71723-0c68-4c45-9aaf-b8add43c7674 x-openstack-request-id: req-dee71723-0c68-4c45-9aaf-b8add43c7674 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.333 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "43faf4bc-65eb-437f-b3dc-707ebe898840", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.333 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840 used request id req-dee71723-0c68-4c45-9aaf-b8add43c7674 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.334 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '43ceb724-51a7-4484-b588-85747155f2fe', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'hostId': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.336 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '396e2944b44f42e59b102db87e2e060c', 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'hostId': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.336 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.339 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 43ceb724-51a7-4484-b588-85747155f2fe / tapb77bd158-fa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.339 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.341 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 / tapcc981c4c-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.342 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b30d929d-5604-4a85-8705-b84cd6fbf2fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.336432', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '675ecea4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '9ac78290751a49597332a7f65eac6299aaecf83871fdd0da53191af814145d32'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.336432', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '675f33da-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '6d2331f2390a14552eeeec142ca590c0fbd30fc93398b70b65142c7e2671c0b4'}]}, 'timestamp': '2026-01-30 09:23:57.342637', '_unique_id': '4f3d29bd65f345b0b07d5cb1f55382a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.346 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.349 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.349 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.349 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>]
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.349 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.349 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70ca6278-d4a7-4f7d-8be2-127b0fd551a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.349630', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '67605314-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '41d0fa93ecc4b9cfa2f6ab0b63a5672ef7165278b7db02fcd816da7fee6c42d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.349630', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '67605c7e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': 'bd807ba52f8b63717810239cbdb94050ca699a8982bcfc8490bea9b94b3aa352'}]}, 'timestamp': '2026-01-30 09:23:57.350115', '_unique_id': '97e3a31c4858414588742864ef1bc8cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.350 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.351 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.351 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '223d6a5a-8565-4eb8-80fa-d8a565ac3fb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.351368', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '676096da-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '0f5f8d8515fc01522a98954529c2c35b89c0f43c9a3976664b6eef2a0a3e8832'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.351368', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '67609ef0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '46f41c75486d23a96ee3de7f3cd7cc4701757faefd8c5a52b2f93e289b61462b'}]}, 'timestamp': '2026-01-30 09:23:57.351809', '_unique_id': 'ef999542fa344453843def692f3141ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:23:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:23:57.376 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:23:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:23:57.377 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:23:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:23:57.378 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.381 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.read.bytes volume: 31025664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.382 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.406 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.read.bytes volume: 30992896 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.407 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69d3bf08-87b3-41d2-afc0-98b7169f0317', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31025664, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.352898', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67653a0a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '24d71ee52c8f90bb2a90d6522969db74da54846d2a76d454485b4277f346b318'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': 
None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.352898', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '676548f6-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '6c5fbf06b81f57f7e298e508fe49c237dd7f1c71e378aa887ec155cd08212dbb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30992896, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.352898', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67691260-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '97143d40d8709ca83167fe398061210de29a94221265bc213b07773713199375'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.352898', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67692048-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '03949670d223fbea685aa42076555b448106f7a866753b4a7170373ac1d05ec4'}]}, 'timestamp': '2026-01-30 09:23:57.407599', '_unique_id': '640241baf75149988983bb2432164358'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.408 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.409 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.422 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/memory.usage volume: 42.546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.437 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/memory.usage volume: 46.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fde2b59d-cbb3-45a7-a4d0-69e13c10eb36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.546875, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'timestamp': '2026-01-30T09:23:57.409538', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '676b7ff0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.817418032, 'message_signature': '92f404263533bb769403f319fb5050070117a11d9eee79eb7ad708408f3888fa'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.32421875, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 
'7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'timestamp': '2026-01-30T09:23:57.409538', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '676dc7ec-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.832311935, 'message_signature': '465cafe005c5e969ca5be9e0732be2debeef11dd9ce80d7c0e939269096f4cd7'}]}, 'timestamp': '2026-01-30 09:23:57.438306', '_unique_id': 'dddb4420435a4d06ae2e57dc1040aa24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.439 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.441 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.453 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.453 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.463 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.464 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8545a868-d5d7-4697-90d2-27438c0c96b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.441665', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '677029d8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.836865365, 'message_signature': '1b85d51ae0c71d514c75943b2ac5918cb567eed1a8e0dfee0488d0337665529c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 
'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.441665', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67703b3a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.836865365, 'message_signature': '9bbbc590acbd42d5658fba0d0cbdf170d32e64aed4c6715d6d30f0a0cea33492'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.441665', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6771cc20-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.849118874, 'message_signature': '1377b1611251a30acdd43781fca6404b014a18fef111070d5b4bf7bebc2b052b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.441665', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6771d986-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.849118874, 'message_signature': '08d02558fa03d5ac2864e94e5e032a15d7780e9c801ed52cdd3f598804efe98c'}]}, 'timestamp': '2026-01-30 09:23:57.464823', '_unique_id': 'bb426f58315446cfa32fa6369bd269c2'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.465 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.467 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.467 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.write.bytes volume: 72941568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.467 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.468 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.468 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '079b4eb6-9261-49e8-b0af-6d512e4a23e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72941568, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.467201', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6772472c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': 'da6df1682ec6ecb6b098e355f50bfb31692e0d17b233193a7cd24ac4c824e294'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 
'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.467201', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67725730-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '2e5fcd258e392086ad9b2a362a64c00710a8aa7c9575e0582d89fce322c3b9b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72970240, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.467201', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67726770-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '66500f072a7609cb58de464a551e7dbcedb62d90b7fef027320cba6be128bd55'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.467201', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '677276ca-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '1d96dd82512fc959642c4dc90fcf7ecf988408c07ca1e62c2e37c1a3f286d5b8'}]}, 'timestamp': '2026-01-30 09:23:57.468843', '_unique_id': 'cd6e4478298f4c75b44a50d64f252fc3'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.471 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.outgoing.bytes volume: 5314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.471 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.outgoing.bytes volume: 5882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b28f1f3-2532-450e-a730-abcbf211620e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5314, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.471107', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6772e1fa-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '1afe4f8904d35c07570082271fc7f785a6cb0b57b1383385b2da4ecbd0f5046e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5882, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.471107', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '6772f2ee-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '661b16c56ba006bbd81f4e7a58cb3db466d3ee8181d73c6d37c727defd34e28f'}]}, 'timestamp': '2026-01-30 09:23:57.472047', '_unique_id': '0d9100a5c6a0417a8c415093e062ec4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.472 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.474 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.474 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>]
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.474 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.outgoing.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.475 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.outgoing.packets volume: 45 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '598324aa-06fc-48db-9787-fbe0e0c3ef46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 41, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.474894', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6773744e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '690f2ff123c72c7f5a8e7ae6f68296c6442835b457d5ca0dca9aa5fe12efbf50'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 45, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.474894', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '677386aa-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '7987202b368f07d9d71a6023d9c96a2524b8085203e9c0c2dfa8bef2616daf53'}]}, 'timestamp': '2026-01-30 09:23:57.475834', '_unique_id': 'ff35178d533744d9a704d56ebc358a30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.476 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.477 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.478 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.478 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.478 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.479 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65b245b9-a07d-4fce-8cc9-ecda848825e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.478002', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6773ee4c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.836865365, 'message_signature': '4a6f82e9c1f868cbf9d31025e609945eba44a83ac19b173aab4b6572ff813fb7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.478002', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6773ff68-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.836865365, 'message_signature': 'ca71349206584e37fa8b0c6c218f3548e66224b5cd7d82ac1843f49e8446178c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.478002', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67740f4e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.849118874, 'message_signature': '674222e6eced556d0d90a3deb15ecb7e3992820430896014a253322a90049b34'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.478002', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6774202e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.849118874, 'message_signature': '20bc0b331fd1d52ec617d95dbfa45f448642b9acc73e53ff4d83b3eae273af4f'}]}, 'timestamp': '2026-01-30 09:23:57.479742', '_unique_id': '9f9541bf86d94feb89888b10c2357f33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.480 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.481 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.482 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.482 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.482 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.483 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b054a543-ba19-4e74-a262-83d5d07a0611', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.481969', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67748956-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.836865365, 'message_signature': '97b563c4195a0863a04fcbde6ace8e8afb224eba45fdf57d1076f7207df83fbf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 
'43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.481969', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67749ac2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.836865365, 'message_signature': '8194941be95449db54e37a745897837045968ef4f601e98069582330b3206748'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.481969', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6774aaa8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.849118874, 'message_signature': '074c5812f4772d9b294294bb3dc863e5c2b2e976e14bbae6f9dadd6de44aeb65'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.481969', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6774bb4c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.849118874, 'message_signature': '6a814a85bf20bc25551fae45f58fbc23fd479f93e881edd2feea812829c5bb8a'}]}, 'timestamp': '2026-01-30 09:23:57.483712', '_unique_id': 'e593e9869a294059820d7115cf5d09f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.484 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.486 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.486 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.487 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.read.requests volume: 1114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.487 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '369841ad-2cc1-4e25-8240-4971b84e0762', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.486342', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67753414-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '0bab6f7b544e885c56e835295f5d8a133ce4fc4c5f8e06194220f57bac35a540'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 
'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.486342', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67754418-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '3d620fcf28cd1f024897ba5bfd792437fe351ade388d94970d52ad461bbaef9f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1114, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.486342', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '677554b2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '6cbcb4424e7533e2df880a304f45e56cc98b1f54ced4ae188b1c1500a07e9bc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.486342', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '677563da-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': 'ad8625d2c3e152e421981ff8dc83a15ce7583936f8c8994f076a5d640fd3ea2e'}]}, 'timestamp': '2026-01-30 09:23:57.488026', '_unique_id': 'd931c953d1d641ac80bbf6216f8f5556'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.488 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.490 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.incoming.bytes volume: 7090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.490 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.incoming.bytes volume: 7194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '247a7065-828a-4e5f-a03e-9f06372383b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7090, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.490225', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6775cb5e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': 'f538cc50600c8b74bdb4e8f548da7f5c1ee270488faa8d3ddb5c964819cd0902'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7194, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.490225', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '6775dd06-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '569308e5a11d51f4a839b0877d2fdea3d9264b25a257aca1bec28b7fdc5710d8'}]}, 'timestamp': '2026-01-30 09:23:57.491150', '_unique_id': '659c744d0c22406f8b84e6c5692ee356'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.492 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.493 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.493 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>]
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.493 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.494 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0cbc1ba-f0aa-4941-9a4a-9701e3075b77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.493865', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6776592a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '3050fd783350a82357eaa5bba7c6cfb8dd6c2797b20c25d97c156a1c13ec293f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 1, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.493865', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '67766c44-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '5477e60b11904ae138462cfda9391b2a28f077c676e2a298560ac44b7ef878c3'}]}, 'timestamp': '2026-01-30 09:23:57.494815', '_unique_id': '80e89041b9c046b5a0c6a4ed39b952b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.495 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.496 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.496 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/cpu volume: 11260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.497 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/cpu volume: 10810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '894be842-f4ca-4685-a06a-01fece9e8e36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11260000000, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'timestamp': '2026-01-30T09:23:57.496929', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '6776d0d0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.817418032, 'message_signature': '21c55966bc9760575a619fab46fbb302c52d80ece8ba37851133c65da7bf2404'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10810000000, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'timestamp': '2026-01-30T09:23:57.496929', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '6776e21e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.832311935, 'message_signature': '3047bdfc54af0258936abbf6498530bd2d36792be817bd9f846c948ee8ef1de7'}]}, 'timestamp': '2026-01-30 09:23:57.497813', '_unique_id': '24a4f48aa8544d0dad43e189ac4c9c76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.498 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.499 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.499 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.read.latency volume: 1094491520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.500 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.read.latency volume: 169389032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.500 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.read.latency volume: 872763087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.501 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.read.latency volume: 112463255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f2c4131-69b4-4c0a-a9e5-ced5311c538b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1094491520, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.499908', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '677745c4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': 'a3a7ace1bc93f364f5b4dc9ab7f7e81f8a029a6002ef7185f003d051aeb044d2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 169389032, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.499908', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6777569a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '815b0754a1e0523784dfa615deff8307f251f2d8f5f57cab17240824449abcd4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 872763087, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.499908', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6777661c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '7c1f3450f361ef7936f63efb4a5136dfc81e82e47f9e56316eccc06b50977678'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 112463255, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.499908', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '677776c0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': 'ed0fc3bdfbe307905fd2e6c5e2782a51310af75a1583c0d583d7ae566effd930'}]}, 'timestamp': '2026-01-30 09:23:57.501616', '_unique_id': '4adae1e0795b4aa28c6784038bbfa9c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.502 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.503 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.504 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7aa9f2d-f5f9-4a0c-a743-a526d26a54ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.503815', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6777ddea-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '1c1c92d644d7b62d8ecbcbb60828cdc45cbd2f47681dbae2c56f6e46fecbca37'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.503815', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '6777efec-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': 'ad13d2f956fa0d50bc639bb02a548b70c125307c511534a4cfc7ebd0f0bd1efa'}]}, 'timestamp': '2026-01-30 09:23:57.504737', '_unique_id': '19e5eb01d95044579d7b57fb36e4f475'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.505 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.506 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.write.requests volume: 300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.507 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.507 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.508 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79b01b7e-ad83-421d-a1b1-55faea5d58a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 300, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.506840', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '677853d8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '3b3e841793afe00674b2a551910300560d6f8626b7dcf31e4179b952828c778c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 
'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.506840', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '677864ea-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '9085c769aa1c5bda57892b5d6e56ca48a95c1274a37811a3177b895b6de47709'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.506840', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67787494-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': 'ef78ac4ce85e1e3890a1ffde23ab9fde7461d5e3c6175cafd255d2e7df8f91c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.506840', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '677884de-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '15ac14be348f8a118eb489ca5b48f1bc3cd4d0d6e2252fe33640308b00f5f7a1'}]}, 'timestamp': '2026-01-30 09:23:57.508532', '_unique_id': '9b256eb872a9412ba2f17068aa3227e6'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.509 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.510 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.incoming.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.511 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.incoming.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c74da18-bada-403e-962a-3fc700c20f93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 41, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.510727', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6778ebc2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': 'd7bf61f73472eca317d44e274801cbe3359202fc0f127b40b8684e7e0bb5a568'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
41, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.510727', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '6778f810-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': 'e36eb0914a6e0f04afe3e78ebedfd1f6b90caefc7b318682ea5c6cbe783290cf'}]}, 'timestamp': '2026-01-30 09:23:57.511427', '_unique_id': '71f485a16f1648c8af9b04a1bb86a629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.512 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1800673208>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659>]
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.513 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.write.latency volume: 3836829478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.513 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.513 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.write.latency volume: 2201435279 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '905c7071-aba4-4d67-b652-e5808b50fd63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3836829478, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-vda', 'timestamp': '2026-01-30T09:23:57.513254', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67794b6c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': 'c607604fca54571b0363ec0ee0ad686e0483ca9735e7edc271edeeffd376b01e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 
'project_name': None, 'resource_id': '43ceb724-51a7-4484-b588-85747155f2fe-sda', 'timestamp': '2026-01-30T09:23:57.513254', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'instance-00000002', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67795648-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.747880533, 'message_signature': '5981d98c3be68cafe0cc712e04d5178f7a092141c4d055894023d6e139928ef9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2201435279, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-vda', 'timestamp': '2026-01-30T09:23:57.513254', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '677960de-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '803d6a0bedc57ea76ee4046d3c67b0e1697fe422c87cfcc995ef5dea2b257cfd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-sda', 'timestamp': '2026-01-30T09:23:57.513254', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'instance-00000003', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67796bec-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.777312341, 'message_signature': '7ba11175fc371841b4e57d4271f077d75474d4648506d0013b540668489e708e'}]}, 'timestamp': '2026-01-30 09:23:57.514380', '_unique_id': '2efcdf26cb074e3699a48e408a5d231f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.514 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.515 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.515 12 DEBUG ceilometer.compute.pollsters [-] 43ceb724-51a7-4484-b588-85747155f2fe/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.515 12 DEBUG ceilometer.compute.pollsters [-] 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25d52cc5-04b4-4832-8b81-78a01c471027', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000002-43ceb724-51a7-4484-b588-85747155f2fe-tapb77bd158-fa', 'timestamp': '2026-01-30T09:23:57.515552', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1800673208', 'name': 'tapb77bd158-fa', 'instance_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:3d:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb77bd158-fa'}, 'message_id': '6779a3be-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.731408465, 'message_signature': '530d5a06fe3357ff1452ae621e0d82ae7c56dcdb374eac381bf8609ac41abfe1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000003-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-tapcc981c4c-eb', 'timestamp': '2026-01-30T09:23:57.515552', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659', 'name': 'tapcc981c4c-eb', 'instance_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:8e:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc981c4c-eb'}, 'message_id': '6779ac06-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3568.734879193, 'message_signature': '4f86b6001cde0bfacecef8ac799e86b4ddc0084fe54d209f42e35c9896af2a70'}]}, 'timestamp': '2026-01-30 09:23:57.515987', '_unique_id': '857c80bfd28548e3927148e82698d85b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:23:57 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:23:57.516 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:23:58 np0005601977 nova_compute[183130]: 2026-01-30 09:23:58.214 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:58 np0005601977 nova_compute[183130]: 2026-01-30 09:23:58.215 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:58 np0005601977 nova_compute[183130]: 2026-01-30 09:23:58.215 183134 DEBUG nova.network.neutron [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:23:59 np0005601977 nova_compute[183130]: 2026-01-30 09:23:59.022 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:59 np0005601977 nova_compute[183130]: 2026-01-30 09:23:59.022 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:23:59 np0005601977 nova_compute[183130]: 2026-01-30 09:23:59.023 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:23:59 np0005601977 nova_compute[183130]: 2026-01-30 09:23:59.049 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:59 np0005601977 nova_compute[183130]: 2026-01-30 09:23:59.643 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.063 183134 DEBUG nova.network.neutron [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.097 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.099 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.099 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.099 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 43ceb724-51a7-4484-b588-85747155f2fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.300 183134 DEBUG nova.virt.libvirt.driver [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.301 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Creating file /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/11da3551992b40548219a04df5a86ec1.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.301 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/11da3551992b40548219a04df5a86ec1.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.730 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/11da3551992b40548219a04df5a86ec1.tmp" returned: 1 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.730 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/11da3551992b40548219a04df5a86ec1.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.731 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Creating directory /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.731 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.938 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:01 np0005601977 nova_compute[183130]: 2026-01-30 09:24:01.942 183134 DEBUG nova.virt.libvirt.driver [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 30 04:24:02 np0005601977 nova_compute[183130]: 2026-01-30 09:24:02.334 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:02 np0005601977 podman[212410]: 2026-01-30 09:24:02.87297073 +0000 UTC m=+0.083915871 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.041 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.061 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.061 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.062 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.063 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.063 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.063 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.064 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.064 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:03 np0005601977 nova_compute[183130]: 2026-01-30 09:24:03.065 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:24:04 np0005601977 kernel: tapb77bd158-fa (unregistering): left promiscuous mode
Jan 30 04:24:04 np0005601977 NetworkManager[55565]: <info>  [1769765044.1305] device (tapb77bd158-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.137 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:24:04Z|00041|binding|INFO|Releasing lport b77bd158-fad4-4c68-8373-a447c84d330b from this chassis (sb_readonly=0)
Jan 30 04:24:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:24:04Z|00042|binding|INFO|Setting lport b77bd158-fad4-4c68-8373-a447c84d330b down in Southbound
Jan 30 04:24:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:24:04Z|00043|binding|INFO|Removing iface tapb77bd158-fa ovn-installed in OVS
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.159 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.163 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.164 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:3d:87 10.100.0.9'], port_security=['fa:16:3e:c4:3d:87 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '43ceb724-51a7-4484-b588-85747155f2fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '791b8d70-6f70-41a1-adc4-21257a54b33a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15b22b09-1a47-4869-84dd-49a1c18840ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=b77bd158-fad4-4c68-8373-a447c84d330b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.166 104706 INFO neutron.agent.ovn.metadata.agent [-] Port b77bd158-fad4-4c68-8373-a447c84d330b in datapath 7f7c04f6-63be-4d15-9767-329e1266cf2c unbound from our chassis#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.169 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f7c04f6-63be-4d15-9767-329e1266cf2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.171 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[259ef062-3299-4b32-9ca1-947796c1a213]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.171 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c namespace which is not needed anymore#033[00m
Jan 30 04:24:04 np0005601977 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 30 04:24:04 np0005601977 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 14.815s CPU time.
Jan 30 04:24:04 np0005601977 systemd-machined[154431]: Machine qemu-1-instance-00000002 terminated.
Jan 30 04:24:04 np0005601977 podman[212443]: 2026-01-30 09:24:04.214808319 +0000 UTC m=+0.063816692 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 30 04:24:04 np0005601977 neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c[211947]: [NOTICE]   (211960) : haproxy version is 2.8.14-c23fe91
Jan 30 04:24:04 np0005601977 neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c[211947]: [NOTICE]   (211960) : path to executable is /usr/sbin/haproxy
Jan 30 04:24:04 np0005601977 neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c[211947]: [ALERT]    (211960) : Current worker (211966) exited with code 143 (Terminated)
Jan 30 04:24:04 np0005601977 neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c[211947]: [WARNING]  (211960) : All workers exited. Exiting... (0)
Jan 30 04:24:04 np0005601977 systemd[1]: libpod-f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03.scope: Deactivated successfully.
Jan 30 04:24:04 np0005601977 podman[212487]: 2026-01-30 09:24:04.288862301 +0000 UTC m=+0.042986521 container died f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:24:04 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03-userdata-shm.mount: Deactivated successfully.
Jan 30 04:24:04 np0005601977 systemd[1]: var-lib-containers-storage-overlay-a4b6a41df0d2fb996237288f1f591e584084ee5de9a88a81192ef09bbff5fa93-merged.mount: Deactivated successfully.
Jan 30 04:24:04 np0005601977 podman[212487]: 2026-01-30 09:24:04.320595271 +0000 UTC m=+0.074719491 container cleanup f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:24:04 np0005601977 systemd[1]: libpod-conmon-f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03.scope: Deactivated successfully.
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.367 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.370 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 podman[212520]: 2026-01-30 09:24:04.377243549 +0000 UTC m=+0.039486722 container remove f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.381 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[89a0474e-0e5a-4583-b298-602371512455]: (4, ('Fri Jan 30 09:24:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c (f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03)\nf34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03\nFri Jan 30 09:24:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c (f34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03)\nf34635d02d650d37072125a56246aeb5c3e985adf628d01e5c311be73bad7f03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.382 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e87ea2ac-7256-44b7-8bfd-d06816940620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.383 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f7c04f6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.385 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 kernel: tap7f7c04f6-60: left promiscuous mode
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.392 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.395 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4315e4af-c938-4b89-b05f-f267ad7b53e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.411 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[10c075d3-2ace-417f-a477-f1a56c16fbf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.412 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9f0de3-eda6-49e9-8e56-4b25996a4e3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.422 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f76c3411-b172-48d8-9787-90031781a05e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350964, 'reachable_time': 31543, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212555, 'error': None, 'target': 'ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.427 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7f7c04f6-63be-4d15-9767-329e1266cf2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:24:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:04.427 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[3521bace-1127-410e-868e-68f325ed794c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:24:04 np0005601977 systemd[1]: run-netns-ovnmeta\x2d7f7c04f6\x2d63be\x2d4d15\x2d9767\x2d329e1266cf2c.mount: Deactivated successfully.
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.645 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.960 183134 INFO nova.virt.libvirt.driver [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Instance shutdown successfully after 3 seconds.#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.965 183134 INFO nova.virt.libvirt.driver [-] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Instance destroyed successfully.#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.966 183134 DEBUG nova.virt.libvirt.vif [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1800673208',display_name='tempest-TestNetworkAdvancedServerOps-server-1800673208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1800673208',id=2,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBWMmOcexm/n60a0wV+ZCzqayVRcO1K6INCRcfXObsrkDu/ozQQv1ArDS1c2ehfKYVYO75n/FnjcB6h7Xs5xpJrmNoh4WV8d5imOfuVdju5GoBbfSHZMPiM0NrBf/MnozQ==',key_name='tempest-TestNetworkAdvancedServerOps-1994586608',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:23:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8bukctap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:23:57Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=43ceb724-51a7-4484-b588-85747155f2fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--652188696", "vif_mac": "fa:16:3e:c4:3d:87"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.966 183134 DEBUG nova.network.os_vif_util [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Converting VIF {"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--652188696", "vif_mac": "fa:16:3e:c4:3d:87"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.967 183134 DEBUG nova.network.os_vif_util [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.967 183134 DEBUG os_vif [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.968 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:04 np0005601977 nova_compute[183130]: 2026-01-30 09:24:04.969 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb77bd158-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.007 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.009 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.011 183134 INFO os_vif [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa')#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.014 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.057 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.059 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.100 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.102 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Copying file /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk to 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.103 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.381 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.647 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "scp -r /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.648 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Copying file /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.648 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk.config 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.849 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "scp -C -r /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk.config 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.config" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.851 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Copying file /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:24:05 np0005601977 nova_compute[183130]: 2026-01-30 09:24:05.851 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk.info 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.062 183134 DEBUG oslo_concurrency.processutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "scp -C -r /var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe_resize/disk.info 192.168.122.102:/var/lib/nova/instances/43ceb724-51a7-4484-b588-85747155f2fe/disk.info" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.162 183134 DEBUG nova.compute.manager [req-41f21dad-4583-4bee-8c40-d0c24dace5ef req-c2a42514-591a-4119-a240-47dfb471a460 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-vif-unplugged-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.163 183134 DEBUG oslo_concurrency.lockutils [req-41f21dad-4583-4bee-8c40-d0c24dace5ef req-c2a42514-591a-4119-a240-47dfb471a460 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.163 183134 DEBUG oslo_concurrency.lockutils [req-41f21dad-4583-4bee-8c40-d0c24dace5ef req-c2a42514-591a-4119-a240-47dfb471a460 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.164 183134 DEBUG oslo_concurrency.lockutils [req-41f21dad-4583-4bee-8c40-d0c24dace5ef req-c2a42514-591a-4119-a240-47dfb471a460 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.164 183134 DEBUG nova.compute.manager [req-41f21dad-4583-4bee-8c40-d0c24dace5ef req-c2a42514-591a-4119-a240-47dfb471a460 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] No waiting events found dispatching network-vif-unplugged-b77bd158-fad4-4c68-8373-a447c84d330b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.164 183134 WARNING nova.compute.manager [req-41f21dad-4583-4bee-8c40-d0c24dace5ef req-c2a42514-591a-4119-a240-47dfb471a460 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received unexpected event network-vif-unplugged-b77bd158-fad4-4c68-8373-a447c84d330b for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.201 183134 DEBUG neutronclient.v2_0.client [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b77bd158-fad4-4c68-8373-a447c84d330b for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.285 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.286 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.293 183134 INFO nova.compute.rpcapi [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.293 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.307 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.308 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:06 np0005601977 nova_compute[183130]: 2026-01-30 09:24:06.308 183134 DEBUG oslo_concurrency.lockutils [None req-31bb010c-7433-441b-b7b9-0b0665437453 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:09 np0005601977 nova_compute[183130]: 2026-01-30 09:24:09.647 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.042 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.207 183134 DEBUG nova.compute.manager [req-642038e9-ad5d-4446-a7cc-9b14dc33eba7 req-0ad20fdc-6fdd-4b31-bf21-dae0454a24fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.208 183134 DEBUG oslo_concurrency.lockutils [req-642038e9-ad5d-4446-a7cc-9b14dc33eba7 req-0ad20fdc-6fdd-4b31-bf21-dae0454a24fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.208 183134 DEBUG oslo_concurrency.lockutils [req-642038e9-ad5d-4446-a7cc-9b14dc33eba7 req-0ad20fdc-6fdd-4b31-bf21-dae0454a24fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.208 183134 DEBUG oslo_concurrency.lockutils [req-642038e9-ad5d-4446-a7cc-9b14dc33eba7 req-0ad20fdc-6fdd-4b31-bf21-dae0454a24fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.209 183134 DEBUG nova.compute.manager [req-642038e9-ad5d-4446-a7cc-9b14dc33eba7 req-0ad20fdc-6fdd-4b31-bf21-dae0454a24fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] No waiting events found dispatching network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:24:10 np0005601977 nova_compute[183130]: 2026-01-30 09:24:10.209 183134 WARNING nova.compute.manager [req-642038e9-ad5d-4446-a7cc-9b14dc33eba7 req-0ad20fdc-6fdd-4b31-bf21-dae0454a24fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received unexpected event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:24:10 np0005601977 podman[212571]: 2026-01-30 09:24:10.834240814 +0000 UTC m=+0.051173713 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:24:10 np0005601977 podman[212572]: 2026-01-30 09:24:10.869965868 +0000 UTC m=+0.078997383 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:24:13 np0005601977 nova_compute[183130]: 2026-01-30 09:24:13.179 183134 DEBUG nova.compute.manager [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-changed-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:24:13 np0005601977 nova_compute[183130]: 2026-01-30 09:24:13.180 183134 DEBUG nova.compute.manager [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Refreshing instance network info cache due to event network-changed-b77bd158-fad4-4c68-8373-a447c84d330b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:24:13 np0005601977 nova_compute[183130]: 2026-01-30 09:24:13.181 183134 DEBUG oslo_concurrency.lockutils [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:24:13 np0005601977 nova_compute[183130]: 2026-01-30 09:24:13.181 183134 DEBUG oslo_concurrency.lockutils [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:24:13 np0005601977 nova_compute[183130]: 2026-01-30 09:24:13.182 183134 DEBUG nova.network.neutron [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Refreshing network info cache for port b77bd158-fad4-4c68-8373-a447c84d330b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:24:14 np0005601977 nova_compute[183130]: 2026-01-30 09:24:14.648 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:15 np0005601977 nova_compute[183130]: 2026-01-30 09:24:15.078 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:15 np0005601977 nova_compute[183130]: 2026-01-30 09:24:15.227 183134 DEBUG nova.network.neutron [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updated VIF entry in instance network info cache for port b77bd158-fad4-4c68-8373-a447c84d330b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:24:15 np0005601977 nova_compute[183130]: 2026-01-30 09:24:15.228 183134 DEBUG nova.network.neutron [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:24:15 np0005601977 nova_compute[183130]: 2026-01-30 09:24:15.249 183134 DEBUG oslo_concurrency.lockutils [req-32324ee5-f3c2-4feb-861f-cc39e895cff9 req-6b1e9c8d-4173-4c75-a017-604e0d7316a3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:24:16 np0005601977 nova_compute[183130]: 2026-01-30 09:24:16.387 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:16 np0005601977 nova_compute[183130]: 2026-01-30 09:24:16.387 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:16 np0005601977 nova_compute[183130]: 2026-01-30 09:24:16.388 183134 DEBUG nova.compute.manager [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 30 04:24:16 np0005601977 podman[212614]: 2026-01-30 09:24:16.954526074 +0000 UTC m=+0.162419170 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:24:17 np0005601977 nova_compute[183130]: 2026-01-30 09:24:17.077 183134 DEBUG neutronclient.v2_0.client [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b77bd158-fad4-4c68-8373-a447c84d330b for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 30 04:24:17 np0005601977 nova_compute[183130]: 2026-01-30 09:24:17.078 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:24:17 np0005601977 nova_compute[183130]: 2026-01-30 09:24:17.079 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:24:17 np0005601977 nova_compute[183130]: 2026-01-30 09:24:17.079 183134 DEBUG nova.network.neutron [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:24:17 np0005601977 nova_compute[183130]: 2026-01-30 09:24:17.080 183134 DEBUG nova.objects.instance [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'info_cache' on Instance uuid 43ceb724-51a7-4484-b588-85747155f2fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.679 183134 DEBUG nova.network.neutron [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Updating instance_info_cache with network_info: [{"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.743 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-43ceb724-51a7-4484-b588-85747155f2fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.743 183134 DEBUG nova.objects.instance [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 43ceb724-51a7-4484-b588-85747155f2fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.774 183134 DEBUG nova.virt.libvirt.host [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.775 183134 INFO nova.virt.libvirt.host [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] UEFI support detected#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.777 183134 DEBUG nova.virt.libvirt.vif [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1800673208',display_name='tempest-TestNetworkAdvancedServerOps-server-1800673208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1800673208',id=2,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBWMmOcexm/n60a0wV+ZCzqayVRcO1K6INCRcfXObsrkDu/ozQQv1ArDS1c2ehfKYVYO75n/FnjcB6h7Xs5xpJrmNoh4WV8d5imOfuVdju5GoBbfSHZMPiM0NrBf/MnozQ==',key_name='tempest-TestNetworkAdvancedServerOps-1994586608',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:24:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8bukctap',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:24:13Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=43ceb724-51a7-4484-b588-85747155f2fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.777 183134 DEBUG nova.network.os_vif_util [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "b77bd158-fad4-4c68-8373-a447c84d330b", "address": "fa:16:3e:c4:3d:87", "network": {"id": "7f7c04f6-63be-4d15-9767-329e1266cf2c", "bridge": "br-int", "label": "tempest-network-smoke--652188696", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb77bd158-fa", "ovs_interfaceid": "b77bd158-fad4-4c68-8373-a447c84d330b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.778 183134 DEBUG nova.network.os_vif_util [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.778 183134 DEBUG os_vif [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.780 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.780 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb77bd158-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.781 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.783 183134 INFO os_vif [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:3d:87,bridge_name='br-int',has_traffic_filtering=True,id=b77bd158-fad4-4c68-8373-a447c84d330b,network=Network(7f7c04f6-63be-4d15-9767-329e1266cf2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb77bd158-fa')#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.783 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.783 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.897 183134 DEBUG nova.compute.provider_tree [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.927 183134 DEBUG nova.scheduler.client.report [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.967 183134 DEBUG nova.compute.manager [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.968 183134 DEBUG oslo_concurrency.lockutils [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.968 183134 DEBUG oslo_concurrency.lockutils [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.969 183134 DEBUG oslo_concurrency.lockutils [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.969 183134 DEBUG nova.compute.manager [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] No waiting events found dispatching network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.969 183134 WARNING nova.compute.manager [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received unexpected event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b for instance with vm_state resized and task_state None.#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.970 183134 DEBUG nova.compute.manager [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.970 183134 DEBUG oslo_concurrency.lockutils [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "43ceb724-51a7-4484-b588-85747155f2fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.970 183134 DEBUG oslo_concurrency.lockutils [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.971 183134 DEBUG oslo_concurrency.lockutils [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.971 183134 DEBUG nova.compute.manager [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] No waiting events found dispatching network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.971 183134 WARNING nova.compute.manager [req-4c3e6df1-f04f-4451-901f-452d1bb86968 req-8bd8a3ff-e721-48bc-af7b-a15e228d3004 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Received unexpected event network-vif-plugged-b77bd158-fad4-4c68-8373-a447c84d330b for instance with vm_state resized and task_state None.#033[00m
Jan 30 04:24:18 np0005601977 nova_compute[183130]: 2026-01-30 09:24:18.998 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:19 np0005601977 nova_compute[183130]: 2026-01-30 09:24:19.323 183134 INFO nova.scheduler.client.report [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocation for migration 9bb9d894-ae85-443f-a27f-16301b77e220#033[00m
Jan 30 04:24:19 np0005601977 nova_compute[183130]: 2026-01-30 09:24:19.397 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765044.3967037, 43ceb724-51a7-4484-b588-85747155f2fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:24:19 np0005601977 nova_compute[183130]: 2026-01-30 09:24:19.397 183134 INFO nova.compute.manager [-] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:24:19 np0005601977 nova_compute[183130]: 2026-01-30 09:24:19.448 183134 DEBUG oslo_concurrency.lockutils [None req-b44ac882-da93-49c9-bb52-af54b109301b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "43ceb724-51a7-4484-b588-85747155f2fe" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:19 np0005601977 nova_compute[183130]: 2026-01-30 09:24:19.455 183134 DEBUG nova.compute.manager [None req-907aa087-aff3-4c88-b89f-972cb0695fa7 - - - - - -] [instance: 43ceb724-51a7-4484-b588-85747155f2fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:24:19 np0005601977 nova_compute[183130]: 2026-01-30 09:24:19.651 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:20 np0005601977 nova_compute[183130]: 2026-01-30 09:24:20.080 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:23 np0005601977 podman[212641]: 2026-01-30 09:24:23.851557046 +0000 UTC m=+0.065479229 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:24:23 np0005601977 nova_compute[183130]: 2026-01-30 09:24:23.901 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Creating tmpfile /var/lib/nova/instances/tmpmvpj_v18 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 30 04:24:24 np0005601977 nova_compute[183130]: 2026-01-30 09:24:24.330 183134 DEBUG nova.compute.manager [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvpj_v18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 30 04:24:24 np0005601977 nova_compute[183130]: 2026-01-30 09:24:24.653 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:24.669 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:24:24 np0005601977 nova_compute[183130]: 2026-01-30 09:24:24.670 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:24.670 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:24:25 np0005601977 nova_compute[183130]: 2026-01-30 09:24:25.120 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:27.672 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:24:29 np0005601977 nova_compute[183130]: 2026-01-30 09:24:29.656 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:30 np0005601977 nova_compute[183130]: 2026-01-30 09:24:30.157 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:33 np0005601977 podman[212666]: 2026-01-30 09:24:33.824540378 +0000 UTC m=+0.047120618 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Jan 30 04:24:34 np0005601977 nova_compute[183130]: 2026-01-30 09:24:34.658 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:34 np0005601977 podman[212689]: 2026-01-30 09:24:34.825071581 +0000 UTC m=+0.047170519 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:24:35 np0005601977 nova_compute[183130]: 2026-01-30 09:24:35.159 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:39 np0005601977 nova_compute[183130]: 2026-01-30 09:24:39.660 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:40 np0005601977 nova_compute[183130]: 2026-01-30 09:24:40.161 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:41 np0005601977 podman[212712]: 2026-01-30 09:24:41.863690142 +0000 UTC m=+0.074285299 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:24:41 np0005601977 podman[212711]: 2026-01-30 09:24:41.878345278 +0000 UTC m=+0.093587497 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:24:44 np0005601977 nova_compute[183130]: 2026-01-30 09:24:44.661 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:45 np0005601977 nova_compute[183130]: 2026-01-30 09:24:45.163 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:47 np0005601977 podman[212754]: 2026-01-30 09:24:47.878419076 +0000 UTC m=+0.089451170 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller)
Jan 30 04:24:49 np0005601977 nova_compute[183130]: 2026-01-30 09:24:49.663 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:50 np0005601977 nova_compute[183130]: 2026-01-30 09:24:50.166 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:54 np0005601977 nova_compute[183130]: 2026-01-30 09:24:54.664 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:54 np0005601977 podman[212780]: 2026-01-30 09:24:54.835829322 +0000 UTC m=+0.055254340 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.168 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.380 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.381 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.381 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.381 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.455 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.514 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.515 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.573 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.766 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.767 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5573MB free_disk=73.33472061157227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.768 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.768 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.847 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.868 183134 WARNING nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.868 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.869 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.932 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.948 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.980 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:24:55 np0005601977 nova_compute[183130]: 2026-01-30 09:24:55.981 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:56 np0005601977 nova_compute[183130]: 2026-01-30 09:24:56.976 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:57.378 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:57.378 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:24:57.379 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:59 np0005601977 nova_compute[183130]: 2026-01-30 09:24:59.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:59 np0005601977 nova_compute[183130]: 2026-01-30 09:24:59.666 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:00 np0005601977 nova_compute[183130]: 2026-01-30 09:25:00.171 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:00 np0005601977 nova_compute[183130]: 2026-01-30 09:25:00.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:00 np0005601977 nova_compute[183130]: 2026-01-30 09:25:00.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:25:01 np0005601977 nova_compute[183130]: 2026-01-30 09:25:01.079 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:01 np0005601977 nova_compute[183130]: 2026-01-30 09:25:01.080 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:01 np0005601977 nova_compute[183130]: 2026-01-30 09:25:01.080 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.456 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updating instance_info_cache with network_info: [{"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.472 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.472 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.473 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.473 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.474 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.474 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:02 np0005601977 nova_compute[183130]: 2026-01-30 09:25:02.474 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:25:04 np0005601977 nova_compute[183130]: 2026-01-30 09:25:04.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:04 np0005601977 nova_compute[183130]: 2026-01-30 09:25:04.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:04 np0005601977 nova_compute[183130]: 2026-01-30 09:25:04.667 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:04 np0005601977 podman[212812]: 2026-01-30 09:25:04.869321648 +0000 UTC m=+0.078400903 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Jan 30 04:25:04 np0005601977 podman[212835]: 2026-01-30 09:25:04.97069875 +0000 UTC m=+0.063950513 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:25:05 np0005601977 nova_compute[183130]: 2026-01-30 09:25:05.174 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:09 np0005601977 nova_compute[183130]: 2026-01-30 09:25:09.669 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:10 np0005601977 nova_compute[183130]: 2026-01-30 09:25:10.228 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:12 np0005601977 podman[212858]: 2026-01-30 09:25:12.855879258 +0000 UTC m=+0.061750899 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:25:12 np0005601977 podman[212859]: 2026-01-30 09:25:12.863937453 +0000 UTC m=+0.067352482 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:25:14 np0005601977 nova_compute[183130]: 2026-01-30 09:25:14.670 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:15Z|00044|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Jan 30 04:25:15 np0005601977 nova_compute[183130]: 2026-01-30 09:25:15.278 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:18 np0005601977 podman[212902]: 2026-01-30 09:25:18.861137764 +0000 UTC m=+0.076972822 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:25:19 np0005601977 nova_compute[183130]: 2026-01-30 09:25:19.673 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:20 np0005601977 nova_compute[183130]: 2026-01-30 09:25:20.279 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.324 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.325 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.357 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.478 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.478 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.486 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.486 183134 INFO nova.compute.claims [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.691 183134 DEBUG nova.compute.provider_tree [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.707 183134 DEBUG nova.scheduler.client.report [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.741 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.742 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.793 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.793 183134 DEBUG nova.network.neutron [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.834 183134 INFO nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.853 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.946 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.948 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.949 183134 INFO nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Creating image(s)#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.949 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.950 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.951 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:22 np0005601977 nova_compute[183130]: 2026-01-30 09:25:22.968 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.036 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.037 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.038 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.056 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.113 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.114 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.149 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.150 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.151 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.225 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.226 183134 DEBUG nova.virt.disk.api [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.227 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.262 183134 DEBUG nova.policy [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.288 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.289 183134 DEBUG nova.virt.disk.api [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.289 183134 DEBUG nova.objects.instance [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 11292039-d151-44b8-87a9-a58bbc82deaa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.312 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.312 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Ensure instance console log exists: /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.313 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.313 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:23 np0005601977 nova_compute[183130]: 2026-01-30 09:25:23.314 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:24 np0005601977 nova_compute[183130]: 2026-01-30 09:25:24.675 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:24 np0005601977 nova_compute[183130]: 2026-01-30 09:25:24.745 183134 DEBUG nova.compute.manager [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvpj_v18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7a073e24-c800-4962-af5e-ff5400800f34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 30 04:25:24 np0005601977 nova_compute[183130]: 2026-01-30 09:25:24.786 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:24 np0005601977 nova_compute[183130]: 2026-01-30 09:25:24.786 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquired lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:24 np0005601977 nova_compute[183130]: 2026-01-30 09:25:24.786 183134 DEBUG nova.network.neutron [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:25:25 np0005601977 nova_compute[183130]: 2026-01-30 09:25:25.281 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:25.459 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:25:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:25.461 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:25:25 np0005601977 nova_compute[183130]: 2026-01-30 09:25:25.494 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:25 np0005601977 podman[212943]: 2026-01-30 09:25:25.844990972 +0000 UTC m=+0.059371219 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:25:27 np0005601977 nova_compute[183130]: 2026-01-30 09:25:27.033 183134 DEBUG nova.network.neutron [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Successfully created port: 72592aff-3a8e-4d04-a6b8-d59e2c43fade _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:25:28 np0005601977 nova_compute[183130]: 2026-01-30 09:25:28.983 183134 DEBUG nova.network.neutron [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updating instance_info_cache with network_info: [{"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.586 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Releasing lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.589 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvpj_v18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7a073e24-c800-4962-af5e-ff5400800f34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.590 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Creating instance directory: /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.590 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Creating disk.info with the contents: {'/var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk': 'qcow2', '/var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.591 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.592 183134 DEBUG nova.objects.instance [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7a073e24-c800-4962-af5e-ff5400800f34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.637 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.677 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.701 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.702 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.702 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.719 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.788 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.789 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.816 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.817 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.817 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.859 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.861 183134 DEBUG nova.virt.disk.api [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Checking if we can resize image /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.862 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.915 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.916 183134 DEBUG nova.virt.disk.api [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Cannot resize image /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:25:29 np0005601977 nova_compute[183130]: 2026-01-30 09:25:29.917 183134 DEBUG nova.objects.instance [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lazy-loading 'migration_context' on Instance uuid 7a073e24-c800-4962-af5e-ff5400800f34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.127 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.149 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.151 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk.config to /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.152 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk.config /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.283 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.401 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.402 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.432 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.516 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.517 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.525 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.525 183134 INFO nova.compute.claims [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.538 183134 DEBUG nova.network.neutron [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Successfully updated port: 72592aff-3a8e-4d04-a6b8-d59e2c43fade _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.568 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.569 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.569 183134 DEBUG nova.network.neutron [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.721 183134 DEBUG oslo_concurrency.processutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk.config /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.723 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.724 183134 DEBUG nova.virt.libvirt.vif [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:23:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1403442336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1403442336',id=6,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:24:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-50if40mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:24:19Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=7a073e24-c800-4962-af5e-ff5400800f34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.725 183134 DEBUG nova.network.os_vif_util [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converting VIF {"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.726 183134 DEBUG nova.network.os_vif_util [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=fb902761-f001-4e8a-9c56-1bdc4fb6a88e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb902761-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.726 183134 DEBUG os_vif [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=fb902761-f001-4e8a-9c56-1bdc4fb6a88e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb902761-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.731 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.731 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.732 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.737 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.738 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb902761-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.739 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb902761-f0, col_values=(('external_ids', {'iface-id': 'fb902761-f001-4e8a-9c56-1bdc4fb6a88e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:52:dd', 'vm-uuid': '7a073e24-c800-4962-af5e-ff5400800f34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.791 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:30 np0005601977 NetworkManager[55565]: <info>  [1769765130.7929] manager: (tapfb902761-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.796 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.800 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.801 183134 INFO os_vif [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=fb902761-f001-4e8a-9c56-1bdc4fb6a88e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb902761-f0')#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.802 183134 DEBUG nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.802 183134 DEBUG nova.compute.manager [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvpj_v18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7a073e24-c800-4962-af5e-ff5400800f34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.809 183134 DEBUG nova.compute.provider_tree [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.834 183134 DEBUG nova.scheduler.client.report [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.858 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.859 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.922 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.923 183134 DEBUG nova.network.neutron [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.942 183134 INFO nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:25:30 np0005601977 nova_compute[183130]: 2026-01-30 09:25:30.970 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.052 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.053 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.054 183134 INFO nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Creating image(s)#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.054 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "/var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.055 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "/var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.055 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "/var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.069 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.123 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.124 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.125 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.135 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.176 183134 DEBUG nova.network.neutron [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.201 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.202 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.284 183134 DEBUG nova.policy [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.378 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk 1073741824" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.380 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.380 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.438 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.439 183134 DEBUG nova.virt.disk.api [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Checking if we can resize image /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.440 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.501 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.503 183134 DEBUG nova.virt.disk.api [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Cannot resize image /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.503 183134 DEBUG nova.objects.instance [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lazy-loading 'migration_context' on Instance uuid 9c98ea59-db8f-40da-830b-351a58e44561 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.534 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.535 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Ensure instance console log exists: /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.535 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.536 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:31 np0005601977 nova_compute[183130]: 2026-01-30 09:25:31.536 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.244 183134 DEBUG nova.network.neutron [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Port fb902761-f001-4e8a-9c56-1bdc4fb6a88e updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.246 183134 DEBUG nova.compute.manager [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvpj_v18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7a073e24-c800-4962-af5e-ff5400800f34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.399 183134 DEBUG nova.network.neutron [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Successfully created port: 67ee4400-6557-46b1-b66a-75f59eee46ea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:25:33 np0005601977 systemd[1]: Starting libvirt proxy daemon...
Jan 30 04:25:33 np0005601977 systemd[1]: Started libvirt proxy daemon.
Jan 30 04:25:33 np0005601977 kernel: tapfb902761-f0: entered promiscuous mode
Jan 30 04:25:33 np0005601977 NetworkManager[55565]: <info>  [1769765133.5874] manager: (tapfb902761-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 30 04:25:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:33Z|00045|binding|INFO|Claiming lport fb902761-f001-4e8a-9c56-1bdc4fb6a88e for this additional chassis.
Jan 30 04:25:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:33Z|00046|binding|INFO|fb902761-f001-4e8a-9c56-1bdc4fb6a88e: Claiming fa:16:3e:9b:52:dd 10.100.0.3
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.589 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.591 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:33Z|00047|binding|INFO|Setting lport fb902761-f001-4e8a-9c56-1bdc4fb6a88e ovn-installed in OVS
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.594 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:33 np0005601977 systemd-machined[154431]: New machine qemu-3-instance-00000006.
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.627 183134 DEBUG nova.compute.manager [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-changed-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.627 183134 DEBUG nova.compute.manager [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Refreshing instance network info cache due to event network-changed-72592aff-3a8e-4d04-a6b8-d59e2c43fade. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:25:33 np0005601977 nova_compute[183130]: 2026-01-30 09:25:33.628 183134 DEBUG oslo_concurrency.lockutils [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:33 np0005601977 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Jan 30 04:25:33 np0005601977 systemd-udevd[213038]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:25:33 np0005601977 NetworkManager[55565]: <info>  [1769765133.6612] device (tapfb902761-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:25:33 np0005601977 NetworkManager[55565]: <info>  [1769765133.6629] device (tapfb902761-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.512 183134 DEBUG nova.network.neutron [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updating instance_info_cache with network_info: [{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.617 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.617 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Instance network_info: |[{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.618 183134 DEBUG oslo_concurrency.lockutils [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.618 183134 DEBUG nova.network.neutron [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Refreshing network info cache for port 72592aff-3a8e-4d04-a6b8-d59e2c43fade _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.620 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Start _get_guest_xml network_info=[{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.625 183134 WARNING nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.638 183134 DEBUG nova.virt.libvirt.host [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.639 183134 DEBUG nova.virt.libvirt.host [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.641 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765134.6410785, 7a073e24-c800-4962-af5e-ff5400800f34 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.642 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] VM Started (Lifecycle Event)#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.645 183134 DEBUG nova.virt.libvirt.host [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.646 183134 DEBUG nova.virt.libvirt.host [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.648 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.648 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.649 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.649 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.650 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.650 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.650 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.651 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.651 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.652 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.652 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.652 183134 DEBUG nova.virt.hardware [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.657 183134 DEBUG nova.virt.libvirt.vif [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:25:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2130032467',display_name='tempest-TestGettingAddress-server-2130032467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2130032467',id=9,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL+BySKS7cSsEJM1gGpSuc/gl0kYfJFv54Hi5jUq0ai9Z4VaR6LjUWtPODzZlmSBuVVJSx3dL7rZ83jUQgropZ6wwOTftLSgNiaOJW3HXwFNeQZ0eUotqoI6Bi1PJ9yeCw==',key_name='tempest-TestGettingAddress-1319690601',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-fklua2y8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:25:22Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=11292039-d151-44b8-87a9-a58bbc82deaa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.657 183134 DEBUG nova.network.os_vif_util [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.658 183134 DEBUG nova.network.os_vif_util [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.659 183134 DEBUG nova.objects.instance [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 11292039-d151-44b8-87a9-a58bbc82deaa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.679 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.695 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.735 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <uuid>11292039-d151-44b8-87a9-a58bbc82deaa</uuid>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <name>instance-00000009</name>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-2130032467</nova:name>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:25:34</nova:creationTime>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        <nova:port uuid="72592aff-3a8e-4d04-a6b8-d59e2c43fade">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe74:c68c" ipVersion="6"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <entry name="serial">11292039-d151-44b8-87a9-a58bbc82deaa</entry>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <entry name="uuid">11292039-d151-44b8-87a9-a58bbc82deaa</entry>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.config"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:74:c6:8c"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <target dev="tap72592aff-3a"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/console.log" append="off"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:25:34 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:25:34 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:25:34 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:25:34 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.737 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Preparing to wait for external event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.737 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.738 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.738 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.738 183134 DEBUG nova.virt.libvirt.vif [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:25:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2130032467',display_name='tempest-TestGettingAddress-server-2130032467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2130032467',id=9,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL+BySKS7cSsEJM1gGpSuc/gl0kYfJFv54Hi5jUq0ai9Z4VaR6LjUWtPODzZlmSBuVVJSx3dL7rZ83jUQgropZ6wwOTftLSgNiaOJW3HXwFNeQZ0eUotqoI6Bi1PJ9yeCw==',key_name='tempest-TestGettingAddress-1319690601',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-fklua2y8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:25:22Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=11292039-d151-44b8-87a9-a58bbc82deaa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.739 183134 DEBUG nova.network.os_vif_util [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.739 183134 DEBUG nova.network.os_vif_util [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.740 183134 DEBUG os_vif [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.740 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.741 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.741 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.744 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.744 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72592aff-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.745 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap72592aff-3a, col_values=(('external_ids', {'iface-id': '72592aff-3a8e-4d04-a6b8-d59e2c43fade', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:c6:8c', 'vm-uuid': '11292039-d151-44b8-87a9-a58bbc82deaa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.746 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:34 np0005601977 NetworkManager[55565]: <info>  [1769765134.7476] manager: (tap72592aff-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.749 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.751 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.753 183134 INFO os_vif [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a')#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.820 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.821 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.821 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:74:c6:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:25:34 np0005601977 nova_compute[183130]: 2026-01-30 09:25:34.821 183134 INFO nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Using config drive#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.423 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765135.42339, 7a073e24-c800-4962-af5e-ff5400800f34 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.425 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.433 183134 INFO nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Creating config drive at /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.config#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.437 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44j9891m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.453 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.456 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.462 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.492 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.554 183134 DEBUG oslo_concurrency.processutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44j9891m" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:35 np0005601977 kernel: tap72592aff-3a: entered promiscuous mode
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.608 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.612 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 NetworkManager[55565]: <info>  [1769765135.6172] manager: (tap72592aff-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.617 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:35Z|00048|binding|INFO|Claiming lport 72592aff-3a8e-4d04-a6b8-d59e2c43fade for this chassis.
Jan 30 04:25:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:35Z|00049|binding|INFO|72592aff-3a8e-4d04-a6b8-d59e2c43fade: Claiming fa:16:3e:74:c6:8c 10.100.0.5 2001:db8::f816:3eff:fe74:c68c
Jan 30 04:25:35 np0005601977 systemd-udevd[213040]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:25:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:35Z|00050|binding|INFO|Setting lport 72592aff-3a8e-4d04-a6b8-d59e2c43fade ovn-installed in OVS
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.628 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:35Z|00051|binding|INFO|Setting lport 72592aff-3a8e-4d04-a6b8-d59e2c43fade up in Southbound
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.629 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.626 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:c6:8c 10.100.0.5 2001:db8::f816:3eff:fe74:c68c'], port_security=['fa:16:3e:74:c6:8c 10.100.0.5 2001:db8::f816:3eff:fe74:c68c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8::f816:3eff:fe74:c68c/64', 'neutron:device_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-175868ce-3812-409c-871e-725dea7b3f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a66ac63e-87fa-44f1-8a47-4e4a2f3b85c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12e8e746-8e5c-4e29-a519-42408cc3b2d9, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=72592aff-3a8e-4d04-a6b8-d59e2c43fade) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.628 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 72592aff-3a8e-4d04-a6b8-d59e2c43fade in datapath 175868ce-3812-409c-871e-725dea7b3f30 bound to our chassis#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.630 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 175868ce-3812-409c-871e-725dea7b3f30#033[00m
Jan 30 04:25:35 np0005601977 NetworkManager[55565]: <info>  [1769765135.6347] device (tap72592aff-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:25:35 np0005601977 NetworkManager[55565]: <info>  [1769765135.6366] device (tap72592aff-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.640 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[045edfb3-9d3f-47ec-9ee3-2bdff8accf6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.641 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap175868ce-31 in ovnmeta-175868ce-3812-409c-871e-725dea7b3f30 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:25:35 np0005601977 systemd-machined[154431]: New machine qemu-4-instance-00000009.
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.642 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap175868ce-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.642 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6e933ce7-f13e-4626-ba03-ebd999da5f57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.643 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[903566b6-2615-4e5d-ad8c-bcef8ac88c99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.652 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[413e9305-8cc8-4e4b-944b-8134ae9f4d06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.662 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[18a4ac16-1e0b-4cd5-84b4-c85a1ff794a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.681 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a9db52e0-e984-42ee-ab23-1a3692bd997a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.686 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0fffcf-7156-4075-865d-b0df081b6d1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 NetworkManager[55565]: <info>  [1769765135.6877] manager: (tap175868ce-30): new Veth device (/org/freedesktop/NetworkManager/Devices/36)
Jan 30 04:25:35 np0005601977 podman[213080]: 2026-01-30 09:25:35.688943391 +0000 UTC m=+0.066751905 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.710 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfe8b8e-f30d-4c1e-860f-3b6f19c646a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.713 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6073ac-3121-4c70-9483-10d6cd81d193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 podman[213079]: 2026-01-30 09:25:35.723028663 +0000 UTC m=+0.119242812 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9/ubi-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:25:35 np0005601977 NetworkManager[55565]: <info>  [1769765135.7309] device (tap175868ce-30): carrier: link connected
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.736 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[dd032ab9-cd9f-431c-9372-15a9dfb1ceee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.749 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aca56fde-91f8-4ef4-a56a-b518bf517ec9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap175868ce-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:b8:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366707, 'reachable_time': 21327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213154, 'error': None, 'target': 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.761 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aa66e11c-fa92-4f23-b946-a52fa7ad08de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:b809'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366707, 'tstamp': 366707}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213156, 'error': None, 'target': 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.773 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2a010de1-57e7-4369-8f84-60a31b570717]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap175868ce-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:b8:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366707, 'reachable_time': 21327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213157, 'error': None, 'target': 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.800 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff9a547-4690-4627-a36b-5cd0b359f75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.837 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[35dd0adb-974d-4aaf-b314-1ebb066533ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.838 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap175868ce-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.839 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.839 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap175868ce-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:35 np0005601977 NetworkManager[55565]: <info>  [1769765135.8416] manager: (tap175868ce-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.840 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 kernel: tap175868ce-30: entered promiscuous mode
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.843 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.844 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap175868ce-30, col_values=(('external_ids', {'iface-id': '3fa91002-4287-41d0-8a0e-e00f676bd48b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.844 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:35Z|00052|binding|INFO|Releasing lport 3fa91002-4287-41d0-8a0e-e00f676bd48b from this chassis (sb_readonly=0)
Jan 30 04:25:35 np0005601977 nova_compute[183130]: 2026-01-30 09:25:35.853 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.854 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/175868ce-3812-409c-871e-725dea7b3f30.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/175868ce-3812-409c-871e-725dea7b3f30.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.856 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[626ed01e-db9c-4d09-aebc-456fa5c402d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.857 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-175868ce-3812-409c-871e-725dea7b3f30
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/175868ce-3812-409c-871e-725dea7b3f30.pid.haproxy
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 175868ce-3812-409c-871e-725dea7b3f30
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:25:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:35.857 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'env', 'PROCESS_TAG=haproxy-175868ce-3812-409c-871e-725dea7b3f30', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/175868ce-3812-409c-871e-725dea7b3f30.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:25:36 np0005601977 podman[213189]: 2026-01-30 09:25:36.176849285 +0000 UTC m=+0.043037304 container create f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:25:36 np0005601977 systemd[1]: Started libpod-conmon-f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd.scope.
Jan 30 04:25:36 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:25:36 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e43944d7c88807a591b646492688e4d2c47062cbef55e484f17fefb4d523aa8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:25:36 np0005601977 podman[213189]: 2026-01-30 09:25:36.156509383 +0000 UTC m=+0.022697412 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:25:36 np0005601977 podman[213189]: 2026-01-30 09:25:36.262558391 +0000 UTC m=+0.128746440 container init f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:25:36 np0005601977 podman[213189]: 2026-01-30 09:25:36.266552597 +0000 UTC m=+0.132740616 container start f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:25:36 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [NOTICE]   (213208) : New worker (213210) forked
Jan 30 04:25:36 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [NOTICE]   (213208) : Loading success.
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.555 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765136.5549252, 11292039-d151-44b8-87a9-a58bbc82deaa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.556 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] VM Started (Lifecycle Event)#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.589 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.593 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765136.5553956, 11292039-d151-44b8-87a9-a58bbc82deaa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.593 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.636 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.639 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:25:36 np0005601977 nova_compute[183130]: 2026-01-30 09:25:36.669 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:25:37 np0005601977 nova_compute[183130]: 2026-01-30 09:25:37.328 183134 DEBUG nova.network.neutron [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Successfully updated port: 67ee4400-6557-46b1-b66a-75f59eee46ea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:25:37 np0005601977 nova_compute[183130]: 2026-01-30 09:25:37.353 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:37 np0005601977 nova_compute[183130]: 2026-01-30 09:25:37.354 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquired lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:37 np0005601977 nova_compute[183130]: 2026-01-30 09:25:37.354 183134 DEBUG nova.network.neutron [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.041 183134 DEBUG nova.network.neutron [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:25:38 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:38Z|00053|binding|INFO|Releasing lport 1e1a9288-6756-4ed5-a91d-8ad95ed4e3ae from this chassis (sb_readonly=0)
Jan 30 04:25:38 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:38Z|00054|binding|INFO|Releasing lport 3fa91002-4287-41d0-8a0e-e00f676bd48b from this chassis (sb_readonly=0)
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.108 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:38 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:38Z|00055|binding|INFO|Claiming lport fb902761-f001-4e8a-9c56-1bdc4fb6a88e for this chassis.
Jan 30 04:25:38 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:38Z|00056|binding|INFO|fb902761-f001-4e8a-9c56-1bdc4fb6a88e: Claiming fa:16:3e:9b:52:dd 10.100.0.3
Jan 30 04:25:38 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:38Z|00057|binding|INFO|Setting lport fb902761-f001-4e8a-9c56-1bdc4fb6a88e up in Southbound
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.376 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:52:dd 10.100.0.3'], port_security=['fa:16:3e:9b:52:dd 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76c186c3-e40e-4db5-b50c-3686091722f9, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=fb902761-f001-4e8a-9c56-1bdc4fb6a88e) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.377 104706 INFO neutron.agent.ovn.metadata.agent [-] Port fb902761-f001-4e8a-9c56-1bdc4fb6a88e in datapath 8e0e3ea2-5897-4c05-8f15-ccf8330993c7 bound to our chassis#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.379 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e3ea2-5897-4c05-8f15-ccf8330993c7#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.389 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ca7915-24a7-47ff-ab6e-488f6e6ee4ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.390 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e0e3ea2-51 in ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.391 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e0e3ea2-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.391 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[99c73374-d070-4db9-b51d-066d47c5da9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.393 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9b7edc-5fde-4aa6-8e98-462ddc9fd6aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.401 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[50404b78-1241-4167-a883-116f06b8618b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.425 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[87100967-f627-4c6f-a250-a8e37d608692]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.448 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[db48b699-65af-46b4-b8fb-157c9b9db636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.454 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5b425f1f-8fd9-4658-9541-ae9a1413d201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 NetworkManager[55565]: <info>  [1769765138.4553] manager: (tap8e0e3ea2-50): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.475 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[8c741e45-a936-41f3-a7a0-b6d9c3a6c113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.478 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[707b2b34-ec93-4bab-9822-0fcfac83621f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 systemd-udevd[213233]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:25:38 np0005601977 NetworkManager[55565]: <info>  [1769765138.4956] device (tap8e0e3ea2-50): carrier: link connected
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.499 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[74cdca27-b611-41f5-8a57-0fea06b7a0c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.513 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ee43c94b-15de-4a7a-8b9b-97dbe2cbbcaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e3ea2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:a6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366983, 'reachable_time': 24355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213252, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.527 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6e6a4a-cb55-4fc7-a5a9-b2c938bdb02b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:a647'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366983, 'tstamp': 366983}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213253, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.539 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[07081843-a9d0-4fcd-ab14-c7dc35fdd88f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e3ea2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:a6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366983, 'reachable_time': 24355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213254, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.562 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[743e159b-4111-446d-9ca1-5a3d57722fa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.614 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[88df8ab0-2841-4531-a262-e20f7a78baa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.616 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e3ea2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.616 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.617 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e3ea2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:38 np0005601977 NetworkManager[55565]: <info>  [1769765138.6196] manager: (tap8e0e3ea2-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.618 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:38 np0005601977 kernel: tap8e0e3ea2-50: entered promiscuous mode
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.623 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e3ea2-50, col_values=(('external_ids', {'iface-id': '15b4d9a6-bad1-4bf8-a262-02e27eb8ea93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:38 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:38Z|00058|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.625 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e0e3ea2-5897-4c05-8f15-ccf8330993c7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e0e3ea2-5897-4c05-8f15-ccf8330993c7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.626 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[02658807-eda5-4f29-adc7-9dba54965241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.626 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-8e0e3ea2-5897-4c05-8f15-ccf8330993c7
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/8e0e3ea2-5897-4c05-8f15-ccf8330993c7.pid.haproxy
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 8e0e3ea2-5897-4c05-8f15-ccf8330993c7
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:25:38 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:38.628 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'env', 'PROCESS_TAG=haproxy-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e0e3ea2-5897-4c05-8f15-ccf8330993c7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.630 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:38 np0005601977 nova_compute[183130]: 2026-01-30 09:25:38.803 183134 INFO nova.compute.manager [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Post operation of migration started#033[00m
Jan 30 04:25:38 np0005601977 podman[213287]: 2026-01-30 09:25:38.927325684 +0000 UTC m=+0.044404594 container create 8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:25:38 np0005601977 systemd[1]: Started libpod-conmon-8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b.scope.
Jan 30 04:25:38 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:25:38 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268efb99abd6e31584aa827875d86e9990f9ff6e50f730d802c5bdf2c3353f2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:25:38 np0005601977 podman[213287]: 2026-01-30 09:25:38.903894931 +0000 UTC m=+0.020973831 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:25:39 np0005601977 podman[213287]: 2026-01-30 09:25:39.004788269 +0000 UTC m=+0.121867229 container init 8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:25:39 np0005601977 podman[213287]: 2026-01-30 09:25:39.009672241 +0000 UTC m=+0.126751151 container start 8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 30 04:25:39 np0005601977 nova_compute[183130]: 2026-01-30 09:25:39.029 183134 DEBUG nova.network.neutron [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updated VIF entry in instance network info cache for port 72592aff-3a8e-4d04-a6b8-d59e2c43fade. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:25:39 np0005601977 nova_compute[183130]: 2026-01-30 09:25:39.030 183134 DEBUG nova.network.neutron [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updating instance_info_cache with network_info: [{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:39 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [NOTICE]   (213306) : New worker (213308) forked
Jan 30 04:25:39 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [NOTICE]   (213306) : Loading success.
Jan 30 04:25:39 np0005601977 nova_compute[183130]: 2026-01-30 09:25:39.065 183134 DEBUG oslo_concurrency.lockutils [req-2ef50a48-0827-405a-b421-5d50a6da9706 req-12a8f5ed-ad6e-4da0-b71e-c829017989de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:39 np0005601977 nova_compute[183130]: 2026-01-30 09:25:39.680 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:39 np0005601977 nova_compute[183130]: 2026-01-30 09:25:39.747 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.222 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.223 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquired lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.224 183134 DEBUG nova.network.neutron [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.538 183134 DEBUG nova.network.neutron [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updating instance_info_cache with network_info: [{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.586 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Releasing lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.587 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Instance network_info: |[{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.591 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Start _get_guest_xml network_info=[{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.597 183134 WARNING nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.603 183134 DEBUG nova.virt.libvirt.host [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.604 183134 DEBUG nova.virt.libvirt.host [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.607 183134 DEBUG nova.virt.libvirt.host [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.608 183134 DEBUG nova.virt.libvirt.host [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.609 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.610 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.611 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.611 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.612 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.612 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.613 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.613 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.613 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.614 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.615 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.615 183134 DEBUG nova.virt.hardware [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.621 183134 DEBUG nova.virt.libvirt.vif [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:25:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1037943593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1037943593',id=10,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-o7sx85if',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-195588
4209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:25:31Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=9c98ea59-db8f-40da-830b-351a58e44561,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.622 183134 DEBUG nova.network.os_vif_util [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converting VIF {"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.623 183134 DEBUG nova.network.os_vif_util [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.625 183134 DEBUG nova.objects.instance [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c98ea59-db8f-40da-830b-351a58e44561 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.654 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <uuid>9c98ea59-db8f-40da-830b-351a58e44561</uuid>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <name>instance-0000000a</name>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1037943593</nova:name>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:25:40</nova:creationTime>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:user uuid="3fd4ee63e94e4c3b9a3e4cefa7e0f626">tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member</nova:user>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:project uuid="58c1f09b90b6436c9e7154cd88c1ba5f">tempest-LiveAutoBlockMigrationV225Test-1955884209</nova:project>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        <nova:port uuid="67ee4400-6557-46b1-b66a-75f59eee46ea">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <entry name="serial">9c98ea59-db8f-40da-830b-351a58e44561</entry>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <entry name="uuid">9c98ea59-db8f-40da-830b-351a58e44561</entry>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.config"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:25:7d:54"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <target dev="tap67ee4400-65"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/console.log" append="off"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:25:40 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:25:40 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:25:40 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:25:40 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.655 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Preparing to wait for external event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.656 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.656 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.657 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.659 183134 DEBUG nova.virt.libvirt.vif [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:25:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1037943593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1037943593',id=10,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-o7sx85if',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225T
est-1955884209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:25:31Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=9c98ea59-db8f-40da-830b-351a58e44561,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.660 183134 DEBUG nova.network.os_vif_util [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converting VIF {"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.661 183134 DEBUG nova.network.os_vif_util [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.661 183134 DEBUG os_vif [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.662 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.663 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.663 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.666 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.667 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67ee4400-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.667 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap67ee4400-65, col_values=(('external_ids', {'iface-id': '67ee4400-6557-46b1-b66a-75f59eee46ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:7d:54', 'vm-uuid': '9c98ea59-db8f-40da-830b-351a58e44561'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:40 np0005601977 NetworkManager[55565]: <info>  [1769765140.6717] manager: (tap67ee4400-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.672 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.678 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.679 183134 INFO os_vif [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65')#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.789 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.790 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.791 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] No VIF found with MAC fa:16:3e:25:7d:54, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:25:40 np0005601977 nova_compute[183130]: 2026-01-30 09:25:40.792 183134 INFO nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Using config drive#033[00m
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.330 183134 INFO nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Creating config drive at /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.config#033[00m
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.336 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3szhx8m8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.460 183134 DEBUG oslo_concurrency.processutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3szhx8m8" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:41 np0005601977 kernel: tap67ee4400-65: entered promiscuous mode
Jan 30 04:25:41 np0005601977 NetworkManager[55565]: <info>  [1769765141.5025] manager: (tap67ee4400-65): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 30 04:25:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:41Z|00059|binding|INFO|Claiming lport 67ee4400-6557-46b1-b66a-75f59eee46ea for this chassis.
Jan 30 04:25:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:41Z|00060|binding|INFO|67ee4400-6557-46b1-b66a-75f59eee46ea: Claiming fa:16:3e:25:7d:54 10.100.0.7
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.507 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.516 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:7d:54 10.100.0.7'], port_security=['fa:16:3e:25:7d:54 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76c186c3-e40e-4db5-b50c-3686091722f9, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=67ee4400-6557-46b1-b66a-75f59eee46ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.518 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 67ee4400-6557-46b1-b66a-75f59eee46ea in datapath 8e0e3ea2-5897-4c05-8f15-ccf8330993c7 bound to our chassis#033[00m
Jan 30 04:25:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:41Z|00061|binding|INFO|Setting lport 67ee4400-6557-46b1-b66a-75f59eee46ea ovn-installed in OVS
Jan 30 04:25:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:41Z|00062|binding|INFO|Setting lport 67ee4400-6557-46b1-b66a-75f59eee46ea up in Southbound
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.519 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.521 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e3ea2-5897-4c05-8f15-ccf8330993c7#033[00m
Jan 30 04:25:41 np0005601977 systemd-machined[154431]: New machine qemu-5-instance-0000000a.
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.538 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4706aa-f477-4122-bc56-dbe9f8a1d7df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:41 np0005601977 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Jan 30 04:25:41 np0005601977 systemd-udevd[213340]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.568 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f26d8964-81ff-43f9-bb55-f65d0ad28ced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.572 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[45bb7a57-4ef1-448a-9b18-0d27646ee907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:41 np0005601977 NetworkManager[55565]: <info>  [1769765141.5756] device (tap67ee4400-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:25:41 np0005601977 NetworkManager[55565]: <info>  [1769765141.5766] device (tap67ee4400-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.597 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c807a810-dcbe-44da-99cf-c03d8e260057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.614 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[356a7989-a2e6-4db5-9ecf-be3b25b8edc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e3ea2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:a6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366983, 'reachable_time': 24355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213350, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.632 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8dce33c3-6e9c-4eb2-b119-ec4a1ea092da]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366992, 'tstamp': 366992}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213351, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366995, 'tstamp': 366995}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213351, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.634 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e3ea2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.636 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:41 np0005601977 nova_compute[183130]: 2026-01-30 09:25:41.638 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.638 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e3ea2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.638 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.639 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e3ea2-50, col_values=(('external_ids', {'iface-id': '15b4d9a6-bad1-4bf8-a262-02e27eb8ea93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:41 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:41.639 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.018 183134 DEBUG nova.compute.manager [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-changed-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.019 183134 DEBUG nova.compute.manager [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Refreshing instance network info cache due to event network-changed-67ee4400-6557-46b1-b66a-75f59eee46ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.019 183134 DEBUG oslo_concurrency.lockutils [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.020 183134 DEBUG oslo_concurrency.lockutils [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.020 183134 DEBUG nova.network.neutron [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Refreshing network info cache for port 67ee4400-6557-46b1-b66a-75f59eee46ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.111 183134 DEBUG nova.compute.manager [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.111 183134 DEBUG oslo_concurrency.lockutils [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.111 183134 DEBUG oslo_concurrency.lockutils [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.112 183134 DEBUG oslo_concurrency.lockutils [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.112 183134 DEBUG nova.compute.manager [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Processing event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.112 183134 DEBUG nova.compute.manager [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.112 183134 DEBUG oslo_concurrency.lockutils [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.112 183134 DEBUG oslo_concurrency.lockutils [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.113 183134 DEBUG oslo_concurrency.lockutils [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.113 183134 DEBUG nova.compute.manager [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] No waiting events found dispatching network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.113 183134 WARNING nova.compute.manager [req-6bf3279c-5748-43b2-a1d3-528bad982a29 req-d8b060a1-ca95-4394-86ee-f25cc13318f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received unexpected event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.114 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.118 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765142.1172583, 11292039-d151-44b8-87a9-a58bbc82deaa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.118 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.120 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.124 183134 INFO nova.virt.libvirt.driver [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Instance spawned successfully.#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.124 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.168 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.170 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.181 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.182 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.182 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.182 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.183 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.183 183134 DEBUG nova.virt.libvirt.driver [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.224 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.278 183134 INFO nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Took 19.33 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.278 183134 DEBUG nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.334 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765142.3336854, 9c98ea59-db8f-40da-830b-351a58e44561 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.335 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] VM Started (Lifecycle Event)#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.351 183134 INFO nova.compute.manager [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Took 19.92 seconds to build instance.#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.359 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.363 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765142.3338253, 9c98ea59-db8f-40da-830b-351a58e44561 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.363 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.386 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.390 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.402 183134 DEBUG oslo_concurrency.lockutils [None req-c9019276-81a3-4bdc-863c-c3e9af867cad 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.432 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:25:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:42Z|00063|binding|INFO|Releasing lport 1e1a9288-6756-4ed5-a91d-8ad95ed4e3ae from this chassis (sb_readonly=0)
Jan 30 04:25:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:42Z|00064|binding|INFO|Releasing lport 3fa91002-4287-41d0-8a0e-e00f676bd48b from this chassis (sb_readonly=0)
Jan 30 04:25:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:42Z|00065|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.748 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:42 np0005601977 nova_compute[183130]: 2026-01-30 09:25:42.972 183134 DEBUG nova.network.neutron [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updating instance_info_cache with network_info: [{"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.016 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Releasing lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.045 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.046 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.048 183134 DEBUG oslo_concurrency.lockutils [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.053 183134 INFO nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 30 04:25:43 np0005601977 virtqemud[182587]: Domain id=3 name='instance-00000006' uuid=7a073e24-c800-4962-af5e-ff5400800f34 is tainted: custom-monitor
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.584 183134 DEBUG nova.network.neutron [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updated VIF entry in instance network info cache for port 67ee4400-6557-46b1-b66a-75f59eee46ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.585 183134 DEBUG nova.network.neutron [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updating instance_info_cache with network_info: [{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:43 np0005601977 nova_compute[183130]: 2026-01-30 09:25:43.604 183134 DEBUG oslo_concurrency.lockutils [req-042848b6-442b-4344-92b6-f9efce5d3234 req-c55427de-9744-4163-a543-6d81dec575c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:43 np0005601977 podman[213361]: 2026-01-30 09:25:43.853117252 +0000 UTC m=+0.059913535 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:25:43 np0005601977 podman[213360]: 2026-01-30 09:25:43.877930605 +0000 UTC m=+0.091657830 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.061 183134 INFO nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.484 183134 DEBUG nova.compute.manager [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.485 183134 DEBUG oslo_concurrency.lockutils [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.485 183134 DEBUG oslo_concurrency.lockutils [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.486 183134 DEBUG oslo_concurrency.lockutils [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.486 183134 DEBUG nova.compute.manager [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Processing event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.487 183134 DEBUG nova.compute.manager [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.487 183134 DEBUG oslo_concurrency.lockutils [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.488 183134 DEBUG oslo_concurrency.lockutils [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.488 183134 DEBUG oslo_concurrency.lockutils [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.488 183134 DEBUG nova.compute.manager [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.489 183134 WARNING nova.compute.manager [req-1dcd889a-9e56-4004-818d-c66f8d481182 req-29954b99-057e-4c15-a2f2-92e2de9d42f2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received unexpected event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.490 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.496 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765144.4956772, 9c98ea59-db8f-40da-830b-351a58e44561 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.496 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.497 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.509 183134 INFO nova.virt.libvirt.driver [-] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Instance spawned successfully.#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.509 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.536 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.541 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.544 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.545 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.545 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.545 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.545 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.546 183134 DEBUG nova.virt.libvirt.driver [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.586 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.661 183134 INFO nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Took 13.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.662 183134 DEBUG nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.683 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.734 183134 INFO nova.compute.manager [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Took 14.25 seconds to build instance.#033[00m
Jan 30 04:25:44 np0005601977 nova_compute[183130]: 2026-01-30 09:25:44.753 183134 DEBUG oslo_concurrency.lockutils [None req-c3b1950e-92e4-4904-a51e-78c50329e5f5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.066 183134 INFO nova.virt.libvirt.driver [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.073 183134 DEBUG nova.compute.manager [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.089 183134 DEBUG nova.objects.instance [None req-5aa9d438-87da-4bb5-af16-15ff0dea3798 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.670 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.986 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.987 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.987 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.987 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.988 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.989 183134 INFO nova.compute.manager [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Terminating instance#033[00m
Jan 30 04:25:45 np0005601977 nova_compute[183130]: 2026-01-30 09:25:45.990 183134 DEBUG nova.compute.manager [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:25:46 np0005601977 kernel: tapcc981c4c-eb (unregistering): left promiscuous mode
Jan 30 04:25:46 np0005601977 NetworkManager[55565]: <info>  [1769765146.0256] device (tapcc981c4c-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.031 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:46Z|00066|binding|INFO|Releasing lport cc981c4c-eb1f-420a-9f46-bfb505a4df87 from this chassis (sb_readonly=0)
Jan 30 04:25:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:46Z|00067|binding|INFO|Setting lport cc981c4c-eb1f-420a-9f46-bfb505a4df87 down in Southbound
Jan 30 04:25:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:46Z|00068|binding|INFO|Removing iface tapcc981c4c-eb ovn-installed in OVS
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.033 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.036 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.038 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:8e:0a 10.100.0.8'], port_security=['fa:16:3e:bf:8e:0a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f7f740b-9b5a-4141-8bd5-a4c35a68eab6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '698009b6-3fdf-4d28-bd4f-05dad4fe7608 f44c0181-adb8-44dc-b12e-9a42af2bf3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b86d5d91-f387-44cf-8812-d69fa2c0ba06, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=cc981c4c-eb1f-420a-9f46-bfb505a4df87) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.040 104706 INFO neutron.agent.ovn.metadata.agent [-] Port cc981c4c-eb1f-420a-9f46-bfb505a4df87 in datapath 25da4a49-e507-4d4f-9263-ce5e8dbdc544 unbound from our chassis#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.042 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25da4a49-e507-4d4f-9263-ce5e8dbdc544, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.045 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e06526-66ef-4311-9b05-bdd19f215745]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.046 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544 namespace which is not needed anymore#033[00m
Jan 30 04:25:46 np0005601977 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 30 04:25:46 np0005601977 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 17.761s CPU time.
Jan 30 04:25:46 np0005601977 systemd-machined[154431]: Machine qemu-2-instance-00000003 terminated.
Jan 30 04:25:46 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [NOTICE]   (212072) : haproxy version is 2.8.14-c23fe91
Jan 30 04:25:46 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [NOTICE]   (212072) : path to executable is /usr/sbin/haproxy
Jan 30 04:25:46 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [WARNING]  (212072) : Exiting Master process...
Jan 30 04:25:46 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [ALERT]    (212072) : Current worker (212074) exited with code 143 (Terminated)
Jan 30 04:25:46 np0005601977 neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544[212068]: [WARNING]  (212072) : All workers exited. Exiting... (0)
Jan 30 04:25:46 np0005601977 systemd[1]: libpod-566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893.scope: Deactivated successfully.
Jan 30 04:25:46 np0005601977 podman[213427]: 2026-01-30 09:25:46.177655819 +0000 UTC m=+0.041679634 container died 566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:25:46 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893-userdata-shm.mount: Deactivated successfully.
Jan 30 04:25:46 np0005601977 systemd[1]: var-lib-containers-storage-overlay-77f74b7f5980594d92733de27819048695e13e091a12d29c346544b8ee891783-merged.mount: Deactivated successfully.
Jan 30 04:25:46 np0005601977 podman[213427]: 2026-01-30 09:25:46.215664416 +0000 UTC m=+0.079688211 container cleanup 566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:25:46 np0005601977 systemd[1]: libpod-conmon-566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893.scope: Deactivated successfully.
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.230 183134 INFO nova.virt.libvirt.driver [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Instance destroyed successfully.#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.231 183134 DEBUG nova.objects.instance [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.249 183134 DEBUG nova.virt.libvirt.vif [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2141171659',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=3,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBTozGo/30MBHMhC0qqyUiXOz453HTS7rAR7GWNweWLgqYHZeswuKN04I0U5cVjAR8MaACmZTSTwsv0g1uGvof8aM0e7Q3swPJgiJmxXuknrNVi52vMRGO+/vfh0PdpnEw==',key_name='tempest-TestSecurityGroupsBasicOps-1644744006',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:23:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-gwxz4owx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:23:05Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=7f7f740b-9b5a-4141-8bd5-a4c35a68eab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.250 183134 DEBUG nova.network.os_vif_util [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "address": "fa:16:3e:bf:8e:0a", "network": {"id": "25da4a49-e507-4d4f-9263-ce5e8dbdc544", "bridge": "br-int", "label": "tempest-network-smoke--1938399564", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc981c4c-eb", "ovs_interfaceid": "cc981c4c-eb1f-420a-9f46-bfb505a4df87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.251 183134 DEBUG nova.network.os_vif_util [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.251 183134 DEBUG os_vif [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.254 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.254 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc981c4c-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.256 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.257 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.260 183134 INFO os_vif [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:8e:0a,bridge_name='br-int',has_traffic_filtering=True,id=cc981c4c-eb1f-420a-9f46-bfb505a4df87,network=Network(25da4a49-e507-4d4f-9263-ce5e8dbdc544),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc981c4c-eb')#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.261 183134 INFO nova.virt.libvirt.driver [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Deleting instance files /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6_del#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.262 183134 INFO nova.virt.libvirt.driver [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Deletion of /var/lib/nova/instances/7f7f740b-9b5a-4141-8bd5-a4c35a68eab6_del complete#033[00m
Jan 30 04:25:46 np0005601977 podman[213473]: 2026-01-30 09:25:46.285974443 +0000 UTC m=+0.049294076 container remove 566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.289 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3185c5b5-73f9-4627-adb4-066abade1953]: (4, ('Fri Jan 30 09:25:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544 (566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893)\n566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893\nFri Jan 30 09:25:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544 (566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893)\n566b9932ef372ce98da752469eafc3ffacdce4766fbbe7418ab2f6a767339893\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.291 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c529f211-ab85-4305-80a7-2471cc9727c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.292 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25da4a49-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.293 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 kernel: tap25da4a49-e0: left promiscuous mode
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.298 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.301 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b0920205-bed3-4dd7-a5d6-c5493efe7a0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.314 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ff02d99e-ab69-495d-88d1-5d6371cf68bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.315 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[568572cd-c29a-44ea-85e6-9edcd126cd0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.332 183134 INFO nova.compute.manager [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.333 183134 DEBUG oslo.service.loopingcall [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.333 183134 DEBUG nova.compute.manager [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:25:46 np0005601977 nova_compute[183130]: 2026-01-30 09:25:46.334 183134 DEBUG nova.network.neutron [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.340 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b32b5b80-61b1-4778-be7e-2a4d16152f80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 351040, 'reachable_time': 23838, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213489, 'error': None, 'target': 'ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.343 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-25da4a49-e507-4d4f-9263-ce5e8dbdc544 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:25:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:46.343 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[ebaeaac4-549f-4fba-a012-6d5a677d854e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:25:46 np0005601977 systemd[1]: run-netns-ovnmeta\x2d25da4a49\x2de507\x2d4d4f\x2d9263\x2dce5e8dbdc544.mount: Deactivated successfully.
Jan 30 04:25:47 np0005601977 nova_compute[183130]: 2026-01-30 09:25:47.805 183134 DEBUG nova.compute.manager [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-changed-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:47 np0005601977 nova_compute[183130]: 2026-01-30 09:25:47.807 183134 DEBUG nova.compute.manager [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Refreshing instance network info cache due to event network-changed-cc981c4c-eb1f-420a-9f46-bfb505a4df87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:25:47 np0005601977 nova_compute[183130]: 2026-01-30 09:25:47.807 183134 DEBUG oslo_concurrency.lockutils [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:47 np0005601977 nova_compute[183130]: 2026-01-30 09:25:47.808 183134 DEBUG oslo_concurrency.lockutils [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:47 np0005601977 nova_compute[183130]: 2026-01-30 09:25:47.808 183134 DEBUG nova.network.neutron [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Refreshing network info cache for port cc981c4c-eb1f-420a-9f46-bfb505a4df87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.074 183134 DEBUG nova.network.neutron [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.142 183134 INFO nova.compute.manager [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Took 1.81 seconds to deallocate network for instance.#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.256 183134 INFO nova.network.neutron [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Port cc981c4c-eb1f-420a-9f46-bfb505a4df87 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.257 183134 DEBUG nova.network.neutron [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.283 183134 DEBUG oslo_concurrency.lockutils [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.283 183134 DEBUG nova.compute.manager [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-vif-unplugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.284 183134 DEBUG oslo_concurrency.lockutils [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.284 183134 DEBUG oslo_concurrency.lockutils [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.284 183134 DEBUG oslo_concurrency.lockutils [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.284 183134 DEBUG nova.compute.manager [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] No waiting events found dispatching network-vif-unplugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.285 183134 DEBUG nova.compute.manager [req-ce3b6ba9-4448-44e7-b7ae-d8392cd24538 req-4f52cc84-bc1b-45ce-9231-f865f1addf3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-vif-unplugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.339 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.340 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.537 183134 DEBUG nova.compute.provider_tree [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.597 183134 DEBUG nova.scheduler.client.report [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:25:48 np0005601977 nova_compute[183130]: 2026-01-30 09:25:48.971 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:49 np0005601977 nova_compute[183130]: 2026-01-30 09:25:49.086 183134 INFO nova.scheduler.client.report [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6#033[00m
Jan 30 04:25:49 np0005601977 nova_compute[183130]: 2026-01-30 09:25:49.388 183134 DEBUG oslo_concurrency.lockutils [None req-cd7b2d86-c873-4641-a481-c6ad556adf1c 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:49 np0005601977 nova_compute[183130]: 2026-01-30 09:25:49.685 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:49 np0005601977 podman[213490]: 2026-01-30 09:25:49.85256504 +0000 UTC m=+0.074208141 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.303 183134 DEBUG nova.compute.manager [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.304 183134 DEBUG oslo_concurrency.lockutils [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.304 183134 DEBUG oslo_concurrency.lockutils [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.305 183134 DEBUG oslo_concurrency.lockutils [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7f7f740b-9b5a-4141-8bd5-a4c35a68eab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.305 183134 DEBUG nova.compute.manager [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] No waiting events found dispatching network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.306 183134 WARNING nova.compute.manager [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received unexpected event network-vif-plugged-cc981c4c-eb1f-420a-9f46-bfb505a4df87 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:25:50 np0005601977 nova_compute[183130]: 2026-01-30 09:25:50.307 183134 DEBUG nova.compute.manager [req-b06e407c-bf86-415e-8af6-11cb598e4ad0 req-d057115a-7e4e-4d89-91ad-43e8382cf666 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Received event network-vif-deleted-cc981c4c-eb1f-420a-9f46-bfb505a4df87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:51 np0005601977 nova_compute[183130]: 2026-01-30 09:25:51.256 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:51 np0005601977 nova_compute[183130]: 2026-01-30 09:25:51.948 183134 INFO nova.compute.manager [None req-f1a57a15-a37a-4dad-b054-16f4083c0fb6 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Pausing#033[00m
Jan 30 04:25:51 np0005601977 nova_compute[183130]: 2026-01-30 09:25:51.949 183134 DEBUG nova.objects.instance [None req-f1a57a15-a37a-4dad-b054-16f4083c0fb6 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lazy-loading 'flavor' on Instance uuid 9c98ea59-db8f-40da-830b-351a58e44561 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:25:51 np0005601977 nova_compute[183130]: 2026-01-30 09:25:51.991 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765151.9909933, 9c98ea59-db8f-40da-830b-351a58e44561 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:25:51 np0005601977 nova_compute[183130]: 2026-01-30 09:25:51.992 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:25:51 np0005601977 nova_compute[183130]: 2026-01-30 09:25:51.993 183134 DEBUG nova.compute.manager [None req-f1a57a15-a37a-4dad-b054-16f4083c0fb6 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:52 np0005601977 nova_compute[183130]: 2026-01-30 09:25:52.022 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:25:52 np0005601977 nova_compute[183130]: 2026-01-30 09:25:52.026 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:25:52 np0005601977 nova_compute[183130]: 2026-01-30 09:25:52.069 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 30 04:25:53 np0005601977 nova_compute[183130]: 2026-01-30 09:25:53.641 183134 DEBUG nova.compute.manager [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-changed-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:25:53 np0005601977 nova_compute[183130]: 2026-01-30 09:25:53.642 183134 DEBUG nova.compute.manager [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Refreshing instance network info cache due to event network-changed-72592aff-3a8e-4d04-a6b8-d59e2c43fade. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:25:53 np0005601977 nova_compute[183130]: 2026-01-30 09:25:53.642 183134 DEBUG oslo_concurrency.lockutils [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:25:53 np0005601977 nova_compute[183130]: 2026-01-30 09:25:53.643 183134 DEBUG oslo_concurrency.lockutils [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:25:53 np0005601977 nova_compute[183130]: 2026-01-30 09:25:53.643 183134 DEBUG nova.network.neutron [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Refreshing network info cache for port 72592aff-3a8e-4d04-a6b8-d59e2c43fade _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:25:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:54Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:c6:8c 10.100.0.5
Jan 30 04:25:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:54Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:c6:8c 10.100.0.5
Jan 30 04:25:54 np0005601977 nova_compute[183130]: 2026-01-30 09:25:54.687 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.376 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.376 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.377 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.449 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'name': 'tempest-TestGettingAddress-server-2130032467', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000009', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'hostId': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7a073e24-c800-4962-af5e-ff5400800f34', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'hostId': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.455 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9c98ea59-db8f-40da-830b-351a58e44561', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'hostId': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.458 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 11292039-d151-44b8-87a9-a58bbc82deaa / tap72592aff-3a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.459 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.462 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7a073e24-c800-4962-af5e-ff5400800f34 / tapfb902761-f0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.462 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9c98ea59-db8f-40da-830b-351a58e44561 / tap67ee4400-65 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.466 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'caf10014-f3ba-4d3b-8893-a474a7928176', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.456559', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'adc677f2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': '02b5a4deb75025d217c378154aff713df6b24fc64d44bf6f6112ed98a4e81f2d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.456559', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'adc70604-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': 'c3dfad7efd4da2697ba47fc3b23e56f9be421b990832246f78c9c787a2facaac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.456559', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'adc79e34-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '11536c7dc1086969335ab8f79e470e7e828a3a7651814e7831f3716f5ff0090e'}]}, 'timestamp': '2026-01-30 09:25:55.467611', '_unique_id': '603ee3dcf7794838899f63ff6845e567'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.483 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.484 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.485 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.496 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.allocation volume: 30617600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.497 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.510 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.510 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96e06e59-9e88-4960-a555-b948dff83430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.472262', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adca3748-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.867232695, 'message_signature': '81de8f6fddfbb3e9d5ce3889f2e9d99da609858d0a13048f4c3edaada176f0c1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.472262', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adca52c8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.867232695, 'message_signature': '65fd31c6dc32d9fad20b3fec54d279071a02a106cd7958ad17c6da56d7e2a4d6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30617600, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.472262', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adcc34da-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.880234504, 'message_signature': 'd9345b7e63ac2ca21f9db3b54e065c454e31fdf635bae72f7b455645307c4996'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.472262', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adcc447a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.880234504, 'message_signature': 'ea5dc9fa610de6c3a32d0b4aeb2ea424c99bee3a59da4297305963202f3b6bb1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 
'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.472262', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adce379e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.892724737, 'message_signature': 'f4f10234ebaa4422bb0b9c894cdceb7b1e40b7c0323e7cb3308ae22ebbc6492d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.472262', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: ral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adce482e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.892724737, 'message_signature': 'b64624c3e750c191143d6fbbad69ca03f532d7166a524ec3e78e5f969ce6ea35'}]}, 'timestamp': '2026-01-30 09:25:55.511055', '_unique_id': '5b35cb7d9c0d4053addf1461a360f194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.513 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.513 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.514 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.514 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.514 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b9993b2-f07d-4550-9694-68c21bf49582', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.514159', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'adcecfb0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': '2198abd3323c1ffe63124fef1cdf645bc6ee2a498dcdcea18294294541afbc16'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.514159', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'adcedcb2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '1048fdb2ef6586464932468ca5d58a074ac091ff4f01de3db82ed428628c20e4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.514159', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'adcee806-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '584d9267ea4e1e0b640acec2d3499634bd58c4b04b310c30ee703af02d77b55f'}]}, 'timestamp': '2026-01-30 09:25:55.515131', '_unique_id': 'badc6e5a5bc84b49b26bf9310a3bff49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.515 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.544 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.write.bytes volume: 72777728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.545 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.566 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.567 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.578 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.579 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.609 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.610 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfacbc53-be81-41de-b741-d657bede68ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72777728, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.516702', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'add37e0c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '67352b63d5d1eff30bc3da2fe474bc6346acacaaff96d16740d39663aef36dad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.516702', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'add399e6-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': 'ac5fc7b7ba9aebdd40f7f1698f74eab7d01f87632d141a9fcd6ce9c20928547b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.516702', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'add8af6c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': 'a75869b1276fa27defca6acec0047b7d269df6a622b66bea60d249d7f61d8f44'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.516702', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'add8c506-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': 'd5710bc62230ab5fa8991bf47635d863d9d181c77e7af41a99b6e27f0dfa330a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 
'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.516702', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addd649e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '2f89b7b886294a8ace2065e4ae08cfd0b20ea5d569ed0ec32c4ea31e3dca8e5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.516702', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_ur
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: , 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addd92fc-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '9814f7ab42c238a8704c8633be78131977a3b4366e87bfd0d2e1aaccd9da2638'}]}, 'timestamp': '2026-01-30 09:25:55.611273', '_unique_id': '2123edae2a2a455eb03400b44f12955c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.613 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.read.bytes volume: 30726656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.613 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.613 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.614 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.614 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.614 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0325bfdf-df76-4099-9961-0165bb1381b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30726656, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.613095', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addde89c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '3428e702e60389aca3a6b77821afbe4046c3fe9b05b6336fde47afe2743ba220'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.613095', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adddf274-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '9459138de320eb409ff38577b538895543e800d6471a7c0ec47ecf265315b8b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.613095', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adde0188-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '97a95b751a678d5006cc0eb8f78c86a942250cbe8fd4b8fcbd5d180a37fb4bab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.613095', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adde0a48-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': 'a4bbbc9dee02dc25bf5760d315a443461d4e3bd54dbc8a655c3f702d00164fba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': 
None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.613095', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adde133a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '10a49847a652c0420362af4a8ddc4d31fa2a6207e9d662262df05830f02f87c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.613095', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'ima
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: sk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adde1a56-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '98834b48f05cb19ad7d2adf2b9c3f3efd7be6d9924c53843774d2ebc8349db0b'}]}, 'timestamp': '2026-01-30 09:25:55.614704', '_unique_id': '33b9e399fc31431ca6921264e8add228'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.616 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.616 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.616 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.616 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70a7f7f8-7494-40fc-96cd-93065f9971fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.616285', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'adde61aa-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': 'ea81157072e219e75ba39a9fa51a804b81e8ffeab157d664a5cae1f19eb78419'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.616285', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'adde695c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '76db63f9efee22b44952b8b651ca4b4d05ed3831bffb898c04a510115b3aa380'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.616285', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'adde7104-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '93349a4d689f9d6d64cf181dfe14f7e4e7e746de7d0d829a15e5a92c960fe8fa'}]}, 'timestamp': '2026-01-30 09:25:55.616909', '_unique_id': 'e9b40bfa7d014689a1a7606f0de4ff34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.617 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.618 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.write.latency volume: 1987663760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.618 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.618 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.618 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.618 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95e2a527-eb4c-49be-acba-13e0a1a34efb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1987663760, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.617975', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addea426-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': 'fb3bf72b99ebe370dc0e50ed4a5d65df20fa8e9e305a215d6efd29ab4248eb0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.617975', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addeaee4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '5f3bd7173d327c2ce0be611733d418da23b205234c8774a7b55b95c7a42438d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.617975', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addeb6b4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': 'e426d672542b8e2ee460a6b0c4cd1cb49035843e1571f8cb9ddc1974e56648c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.617975', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addebd94-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '7e1cb6b17c182e789403432e536d926b85bcb34d9b35198566be312f475cb876'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': 
None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.617975', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addec46a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': 'ce6d478a57aca3b486dbf1357006e905957fa295ed4e6fb47a1d7bd2626cb746'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.617975', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: ': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addecc30-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': 'fa6db870fc40c115970eb7f75afaad1c36f027203cf41d7394d7f77b89e6b622'}]}, 'timestamp': '2026-01-30 09:25:55.619243', '_unique_id': 'd9a4a8f2aa1741e99ee74fe0a60e77d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.620 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.620 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.620 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.621 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.621 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.621 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b93182f-7d75-489e-ae89-48bfa57cc5dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.620423', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addf0358-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '2fe81790e145f02e6060073ca8c7c26e27f1f399eccedaae30234d136c5dd366'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.620423', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addf0bfa-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '62b6e821bf6e5f3eb2ab4187019d9e88bd4fb1a129c242ad465cb0d4c14cf568'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.620423', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addf13c0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '734e5adb936047b0808ad7bb48c75bcf040d063d35ced421a12f035c9817c806'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.620423', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addf1a8c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '0041404f54cee891afc1047b2ccb71c5974fbdd407c95ef36686bb50f6a8afc0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.620423', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'addf24f0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '0c7a9c3e510ddead88523c6ab67f8beeecc532b74e01473bbba65711da2e8f0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.620423', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'addf2c34-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': 'a8bd5458bf9bc728bdebbdf4f8ee3f69dabb4dba7c0ecb7e00a2af9df15aa56b'}]}, 'timestamp': '2026-01-30 09:25:55.621714', '_unique_id': '78b9d2a114a547abb27b313ce8d9624a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.622 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.outgoing.bytes volume: 1418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.623 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.623 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc5256e7-7266-4e2f-8ac1-1809ea1741a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1418, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.622923', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'addf64b0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': '52a7607e6b4ef5d0503c2e44860bf367d4e33a38f7b65bdd398f8b66d0a0332e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.622923', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'addf6f64-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': 'edb98c6c7f52407372d255ebb55779c0031f8f51d13c67b4df9c3b722ccd4601'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.622923', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'addf7914-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '05d1596bc304119fc3674f00c0208a89fd7b7ff03d6b20daaf7a6d00835729ab'}]}, 'timestamp': '2026-01-30 09:25:55.623665', '_unique_id': '7a2ac8390e374bfe90c3f05d299cbca8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.incoming.bytes volume: 1692 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.624 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.bytes volume: 706 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.incoming.bytes volume: 132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '584d34c6-9365-4885-ad2e-c0ae640e2b70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1692, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.624721', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'addfac18-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': '4c827a09aedc2483156c26b76223b98353882c25ba4baaf44d72571fd13ecc5a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 706, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.624721', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'addfb46a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '247d64ce8df5167da7f5e01c6ba8db5abc6503d268935e32508d67bde2e6d536'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 132, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.624721', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'addfbc94-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '7868cfb70821bf045f9e04c1fe0ae26dec19f0ffe9b0d038e0bfa12f7122f214'}]}, 'timestamp': '2026-01-30 09:25:55.625382', '_unique_id': '48e49844ca7a4dc2b055d46473da95ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.612 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.625 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.626 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.626 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.626 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.626 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.626 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51f88501-8941-4614-92c7-f5cc023860f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.626870', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'addffed4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': '30f1bc2e5a0330434db36b95c621c4b9ff5550edad3cfe27f53476e7a193171b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.626870', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'ade0074e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '0c86fc43381321ecf517032385e682d4820f654d27d0f591d5e302559fe9f412'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.626870', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'ade00f0a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '3dc61d5e78eedad0ce7902536fca7ea2674a947b963387ec839bd36541650812'}]}, 'timestamp': '2026-01-30 09:25:55.627489', '_unique_id': '841ab0f4983844799c2026d187da6c76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.627 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.628 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.628 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.read.requests volume: 1111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.628 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.628 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.629 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.629 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.629 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6609924-6f51-4a74-94a4-7428f2fcc906', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1111, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.628520', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade03f3e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '9c7aa7b3631120c415bfc8a1e012041f8ac9ed626f58bf6df566d1a687db521e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.628520', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade04894-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '603a12815967b909a5afc5a4341097c77613fd33baa0347348eea22e40a8dd7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.628520', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade051cc-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': 'd8584249fba11452a3ae4d26e9cfde07848a600f9af9a2b5b60f870c5a02ce7a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.628520', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade059b0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '2653b56e301cbc3a6ad0cb7482c4d20195b1cea4865cf9973a5dbbd916b661cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.628520', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade060b8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '680fa1ef69b1d31d6d093f55215bfdb4b8597fdec3d964f73006f2b554098faf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.628520', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7c
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade0698c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': 'b726325f82cfe83a8317b5076ed2d4cba4d5c20c1ba3ad342e97ac155cf415a0'}]}, 'timestamp': '2026-01-30 09:25:55.629866', '_unique_id': '266b398cec164037a4dd760fc5d05348'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.615 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.630 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.631 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.631 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.631 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.631 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.631 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac925142-5245-4bd3-86d3-3b8999af2e66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.631072', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade0a5fa-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.867232695, 'message_signature': '55a770d8f0e271b32754f6bc1326bffd0bfa1b7fc196694fe59f770be12770e5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.631072', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade0afaa-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.867232695, 'message_signature': 'c05a1bfd1a95509150ea5280a65b235e7ab8b8409a63a98a8f9c18ad5b2ecbd8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.631072', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade0b716-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.880234504, 'message_signature': '7c5fb5adde7f4bad87b615c25da83b8fd01319259f16a5db21842941e87df305'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.631072', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade0bde2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.880234504, 'message_signature': '87db76d32637ad2cdf510be9e818e13d12508f6903b3a3aabf25abcbd291e6d3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 
'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.631072', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade0c5a8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.892724737, 'message_signature': 'f079167dee8ddfffbdaeab9b32e0f87d59fb073d3348d04c7db561fd36287f57'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.631072', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'ar
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade0d020-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.892724737, 'message_signature': '56ff1af7604adc5080865f1fbe0f9bcf397eff6d6f1f946b194f8db546fdacbc'}]}, 'timestamp': '2026-01-30 09:25:55.632450', '_unique_id': 'c74e191b8a9e4da19307715fc1267edd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.633 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.633 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.633 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.634 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.incoming.packets volume: 2 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.635 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54111d18-a7f7-4f03-8388-e97eede39185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.633616', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'ade106ee-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': '81a67bb444e9859949807a8cac564708fea5b341fb1bcdcdc225b67147869065'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.633616', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'ade115e4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '58fcfdf869032ad2fdd6e6828ccc3d0a11b972d5d4f0e82042c748a6931a6c59'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 2, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.633616', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'ade12070-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '45d5d40ffc517b8bf2021ec24efbef5b7d09e556a1d699654f60c9d83c3b0262'}]}, 'timestamp': '2026-01-30 09:25:55.634525', '_unique_id': 'a0694d5715624ccd9a86579014088d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.635 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.636 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.636 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.636 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.636 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37c81eb7-3669-493d-a68e-a660a76c2a09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.635653', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade155ae-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.867232695, 'message_signature': 'c1d4f581f76e81fb92b1f01168e7021d7fd7b4c3fa5334e29880d92d3bb56f2e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.635653', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade15cfc-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.867232695, 'message_signature': 'ca71e986edbb2314d8eb36337c6909bfda7f7d5f4ff5f7142fb5bd408086d6af'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.635653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade16698-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.880234504, 'message_signature': '363407acb6d62fc48f3da81899ff7118977614cb195eb279b77485db335c2307'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.635653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ade16f8a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.880234504, 'message_signature': '446a50366d4f284b1c5bfe61b6ec0dafc6e691fb3b21b72d79ffff11ded63b04'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 
'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.635653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ade176c4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.892724737, 'message_signature': '34086aa10b1b3c2285ed5758c03f30e79db76a3ff3df084c92917ef03bec2b4f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.635653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: k_name': 'sda'}, 'message_id': 'ade17d90-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.892724737, 'message_signature': '9df11e0e0d451d3b193b17c80523212ff5b1091f1aa56930fdbed654f9d0602a'}]}, 'timestamp': '2026-01-30 09:25:55.636865', '_unique_id': '0f5653c938714299a39758c6307e10a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.638 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.638 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.638 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.638 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8faa3840-3947-4567-b6c0-1cde5c5c0d87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.638186', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'ade1b9b8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': 'ce1cd399e6dac3a0983e2d39f5fe149a0788d46f37518b627d79c6a5ae8ed6d5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.638186', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'ade1c3cc-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '64d1e57eab5e8a7fca1969bd2cc484b66752b0dd804c1125bb4478c5f6f87520'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.638186', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'ade1cc64-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '12370b79fc874c54f8bad9d4780b3c7c11dfd0dfa2d7e945d65649ae1eab77ce'}]}, 'timestamp': '2026-01-30 09:25:55.638932', '_unique_id': 'd4d48a9c54744547a8f8dc391a9ee786'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.639 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.640 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.640 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.640 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca5e194-f20d-4721-8e43-0fcbf606a7ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.640127', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'ade205a8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': 'a6f3f867586f606189c95711301ad3829f62d2be81c80129b4608668f44debee'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.640127', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'ade20efe-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': '9e12a557890926810c5759f1a3a55823a8cfdb4a21232f792c7fcad10e65f2ff'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.640127', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'ade216f6-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': '1c5606df8aa79f6a824a66e5a2137f65767f7cc309496071b2ec5016b8c3dc09'}]}, 'timestamp': '2026-01-30 09:25:55.640839', '_unique_id': 'd6e57f7c964346a0bab0c7694b913aa4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.642 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.642 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.642 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d883b9c-5322-4a9c-a3df-c8b39c726af3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000009-11292039-d151-44b8-87a9-a58bbc82deaa-tap72592aff-3a', 'timestamp': '2026-01-30T09:25:55.642076', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'tap72592aff-3a', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c6:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72592aff-3a'}, 'message_id': 'ade25328-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.851468456, 'message_signature': 'e4bb76a2d5d7f7cb162de48f46c3db0542d058ac4052eca7b940d55b4e85c42e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:25:55.642076', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'ade25bc0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.855093992, 'message_signature': 'e466ca156694d42e95ed25a46bb1740ef6d20f46d81431bb37848de460fd8fe0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-0000000a-9c98ea59-db8f-40da-830b-351a58e44561-tap67ee4400-65', 'timestamp': '2026-01-30T09:25:55.642076', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'tap67ee4400-65', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': 
'7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:7d:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap67ee4400-65'}, 'message_id': 'ade265de-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.858531622, 'message_signature': 'c2b7633c3748a6d0620e7badfc70df3c1034231da60e150f59f0d056aa853ce4'}]}, 'timestamp': '2026-01-30 09:25:55.642864', '_unique_id': 'd0c0087c6aae4f54b174cae0e1bf74bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.619 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.643 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.644 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.658 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/memory.usage volume: 40.44140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.622 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.674 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/memory.usage volume: 42.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.630 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.693 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.693 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9c98ea59-db8f-40da-830b-351a58e44561: ceilometer.compute.pollsters.NoVolumeException
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0b70be7-8189-4c05-b9b2-c377af1ae4a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44140625, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'timestamp': '2026-01-30T09:25:55.644199', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ade4e0a2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3687.053423886, 'message_signature': 'fecc798429d1c5e4b528f97696591b583d09d30ebe171117b76012f5856b12d9'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4140625, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 
'7a073e24-c800-4962-af5e-ff5400800f34', 'timestamp': '2026-01-30T09:25:55.644199', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ade742d4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3687.069111273, 'message_signature': '40e4fe8cdb483dd69317b3637b1370c3bf08e7b150abbb69bd83f6de9c7b8c9a'}]}, 'timestamp': '2026-01-30 09:25:55.694131', '_unique_id': 'eb914cee16e24c919cacfc6f0c25e816'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.696 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.read.latency volume: 447985273 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.696 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/disk.device.read.latency volume: 42010175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.696 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.696 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.696 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.read.latency volume: 356690004 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/disk.device.read.latency volume: 3955445 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcbaa0a2-e471-4e1d-b834-06f667755566', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 447985273, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-vda', 'timestamp': '2026-01-30T09:25:55.696050', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adea8e58-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '2342d04ad4576e3bfbdd2b38a80a1802d6fd1e9ba904593f1355880d7d761b0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42010175, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa-sda', 'timestamp': '2026-01-30T09:25:55.696050', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adea96f0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.911609117, 'message_signature': '192b684557f94c950532f1fc0a772048fa07444cc86a67c226446789a441f276'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:25:55.696050', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adea9e20-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '1d72b2624b5612d1be9c135bbfb8fb9851545c05d6d99757a63dd644bde2472d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:25:55.696050', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adeaa532-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.940927031, 'message_signature': '3aab6f2f241422833036a316bca47e253456431cbf599634785b25902e40f8a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 356690004, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 
'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-vda', 'timestamp': '2026-01-30T09:25:55.696050', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'adeaac44-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': 'ad9255cd75dba05d7a93ffee969dc4092718aaa24a47c4a446be76d9a36a581c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3955445, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561-sda', 'timestamp': '2026-01-30T09:25:55.696050', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'adeab342-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3686.974744795, 'message_signature': '0bbdfe9e9c9151b414539f6de68af4f8c35ffa60dd80098451320017b469c8f7'}]}, 'timestamp': '2026-01-30 09:25:55.697242', '_unique_id': '82d929bbac3d45aa9a2367b0c6d47b71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.632 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.698 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.698 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2130032467>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1403442336>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1037943593>]
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.698 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.698 12 DEBUG ceilometer.compute.pollsters [-] 11292039-d151-44b8-87a9-a58bbc82deaa/cpu volume: 11010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.698 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/cpu volume: 110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 DEBUG ceilometer.compute.pollsters [-] 9c98ea59-db8f-40da-830b-351a58e44561/cpu volume: 7250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a06694d4-0a57-4e2d-b531-5c94098096be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11010000000, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'timestamp': '2026-01-30T09:25:55.698696', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-2130032467', 'name': 'instance-00000009', 'instance_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'adeaf438-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3687.053423886, 'message_signature': 'cd6aee880e5f51684803b3b12362ccfcb243e5ab7de7207e15406619fac40c3d'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110000000, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 
'7a073e24-c800-4962-af5e-ff5400800f34', 'timestamp': '2026-01-30T09:25:55.698696', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'adeafb9a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3687.069111273, 'message_signature': 'c748064e799bb00b7999e9504830feaa99dc66424cb177c6117bf9d1473e7364'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7250000000, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'timestamp': '2026-01-30T09:25:55.698696', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1037943593', 'name': 'instance-0000000a', 'instance_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'adeb02ac-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3687.088530378, 'message_signature': '358b6aa0f221402daef8a29cbaa2a8cb58881e64a52192cde31e29bbaaa7ab0c'}]}, 'timestamp': '2026-01-30 09:25:55.699272', '_unique_id': '830d12044f104d8b8a7b475ff9e20f31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:25:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:25:55.699 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.702 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.703 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.637 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:25:55.697 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.764 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.769 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.811 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.812 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.862 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.991 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.992 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5219MB free_disk=73.30085372924805GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.992 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:55 np0005601977 nova_compute[183130]: 2026-01-30 09:25:55.992 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.130 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.130 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 11292039-d151-44b8-87a9-a58bbc82deaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.131 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 9c98ea59-db8f-40da-830b-351a58e44561 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.131 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.131 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.249 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.258 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.267 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.304 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.305 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:56Z|00069|binding|INFO|Releasing lport 3fa91002-4287-41d0-8a0e-e00f676bd48b from this chassis (sb_readonly=0)
Jan 30 04:25:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:25:56Z|00070|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.613 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:56 np0005601977 podman[213548]: 2026-01-30 09:25:56.872505708 +0000 UTC m=+0.081113612 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.990 183134 DEBUG nova.network.neutron [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updated VIF entry in instance network info cache for port 72592aff-3a8e-4d04-a6b8-d59e2c43fade. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:25:56 np0005601977 nova_compute[183130]: 2026-01-30 09:25:56.990 183134 DEBUG nova.network.neutron [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updating instance_info_cache with network_info: [{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:25:57 np0005601977 nova_compute[183130]: 2026-01-30 09:25:57.025 183134 DEBUG oslo_concurrency.lockutils [req-98f9f285-b572-4bdb-8e8d-dfa1bc49ef2e req-ab634d69-985d-40e0-939c-fcecf0387596 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:25:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:57.378 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:57.379 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:25:57.380 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:57 np0005601977 nova_compute[183130]: 2026-01-30 09:25:57.614 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:25:59 np0005601977 nova_compute[183130]: 2026-01-30 09:25:59.690 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.305 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.635 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.636 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.636 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:26:00 np0005601977 nova_compute[183130]: 2026-01-30 09:26:00.637 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 11292039-d151-44b8-87a9-a58bbc82deaa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:01 np0005601977 nova_compute[183130]: 2026-01-30 09:26:01.228 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765146.2273593, 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:01 np0005601977 nova_compute[183130]: 2026-01-30 09:26:01.229 183134 INFO nova.compute.manager [-] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:26:01 np0005601977 nova_compute[183130]: 2026-01-30 09:26:01.255 183134 DEBUG nova.compute.manager [None req-8b7a2f54-7f25-4d26-b502-2019156b84e9 - - - - - -] [instance: 7f7f740b-9b5a-4141-8bd5-a4c35a68eab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:01 np0005601977 nova_compute[183130]: 2026-01-30 09:26:01.259 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:03 np0005601977 nova_compute[183130]: 2026-01-30 09:26:03.272 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:03 np0005601977 nova_compute[183130]: 2026-01-30 09:26:03.981 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.293 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updating instance_info_cache with network_info: [{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.319 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.320 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.320 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.321 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.321 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.321 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.321 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:26:04 np0005601977 nova_compute[183130]: 2026-01-30 09:26:04.693 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:05 np0005601977 nova_compute[183130]: 2026-01-30 09:26:05.343 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:05 np0005601977 podman[213572]: 2026-01-30 09:26:05.831965264 +0000 UTC m=+0.051179251 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:26:05 np0005601977 podman[213571]: 2026-01-30 09:26:05.852575544 +0000 UTC m=+0.074914492 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, vendor=Red Hat, Inc., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal)
Jan 30 04:26:06 np0005601977 nova_compute[183130]: 2026-01-30 09:26:06.261 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:06 np0005601977 nova_compute[183130]: 2026-01-30 09:26:06.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:06 np0005601977 nova_compute[183130]: 2026-01-30 09:26:06.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:09 np0005601977 nova_compute[183130]: 2026-01-30 09:26:09.695 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.081 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.082 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.130 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.236 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.237 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.245 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.246 183134 INFO nova.compute.claims [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.497 183134 DEBUG nova.compute.provider_tree [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.549 183134 DEBUG nova.scheduler.client.report [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.582 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.583 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.659 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.660 183134 DEBUG nova.network.neutron [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.696 183134 INFO nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.720 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.822 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.823 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.823 183134 INFO nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Creating image(s)#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.824 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.824 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.824 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.837 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.887 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.888 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.888 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.899 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.935 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.936 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.936 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.936 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.937 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.938 183134 INFO nova.compute.manager [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Terminating instance#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.939 183134 DEBUG nova.compute.manager [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.942 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.942 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.964 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.965 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.966 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:10 np0005601977 kernel: tap72592aff-3a (unregistering): left promiscuous mode
Jan 30 04:26:10 np0005601977 NetworkManager[55565]: <info>  [1769765170.9864] device (tap72592aff-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.990 183134 DEBUG nova.policy [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:26:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:10Z|00071|binding|INFO|Releasing lport 72592aff-3a8e-4d04-a6b8-d59e2c43fade from this chassis (sb_readonly=0)
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.993 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:10Z|00072|binding|INFO|Setting lport 72592aff-3a8e-4d04-a6b8-d59e2c43fade down in Southbound
Jan 30 04:26:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:10Z|00073|binding|INFO|Removing iface tap72592aff-3a ovn-installed in OVS
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.996 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.999 183134 DEBUG nova.compute.manager [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-changed-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.999 183134 DEBUG nova.compute.manager [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Refreshing instance network info cache due to event network-changed-72592aff-3a8e-4d04-a6b8-d59e2c43fade. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:26:10 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.999 183134 DEBUG oslo_concurrency.lockutils [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:10.999 183134 DEBUG oslo_concurrency.lockutils [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.000 183134 DEBUG nova.network.neutron [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Refreshing network info cache for port 72592aff-3a8e-4d04-a6b8-d59e2c43fade _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.000 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.003 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:c6:8c 10.100.0.5 2001:db8::f816:3eff:fe74:c68c'], port_security=['fa:16:3e:74:c6:8c 10.100.0.5 2001:db8::f816:3eff:fe74:c68c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8::f816:3eff:fe74:c68c/64', 'neutron:device_id': '11292039-d151-44b8-87a9-a58bbc82deaa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-175868ce-3812-409c-871e-725dea7b3f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a66ac63e-87fa-44f1-8a47-4e4a2f3b85c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12e8e746-8e5c-4e29-a519-42408cc3b2d9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=72592aff-3a8e-4d04-a6b8-d59e2c43fade) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.004 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 72592aff-3a8e-4d04-a6b8-d59e2c43fade in datapath 175868ce-3812-409c-871e-725dea7b3f30 unbound from our chassis#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.006 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 175868ce-3812-409c-871e-725dea7b3f30, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.006 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b01aa0f6-24b2-4ab5-a245-96ff099cba51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.007 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-175868ce-3812-409c-871e-725dea7b3f30 namespace which is not needed anymore#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.030 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.031 183134 DEBUG nova.virt.disk.api [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.031 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:11 np0005601977 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 30 04:26:11 np0005601977 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 13.490s CPU time.
Jan 30 04:26:11 np0005601977 systemd-machined[154431]: Machine qemu-4-instance-00000009 terminated.
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.093 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.094 183134 DEBUG nova.virt.disk.api [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.095 183134 DEBUG nova.objects.instance [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 37aaa571-2821-4d88-b360-9f7b02c1aa1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:11 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [NOTICE]   (213208) : haproxy version is 2.8.14-c23fe91
Jan 30 04:26:11 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [NOTICE]   (213208) : path to executable is /usr/sbin/haproxy
Jan 30 04:26:11 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [WARNING]  (213208) : Exiting Master process...
Jan 30 04:26:11 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [WARNING]  (213208) : Exiting Master process...
Jan 30 04:26:11 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [ALERT]    (213208) : Current worker (213210) exited with code 143 (Terminated)
Jan 30 04:26:11 np0005601977 neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30[213204]: [WARNING]  (213208) : All workers exited. Exiting... (0)
Jan 30 04:26:11 np0005601977 systemd[1]: libpod-f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd.scope: Deactivated successfully.
Jan 30 04:26:11 np0005601977 podman[213649]: 2026-01-30 09:26:11.111417155 +0000 UTC m=+0.039816344 container died f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.116 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.116 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Ensure instance console log exists: /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.117 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.117 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.117 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:11 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd-userdata-shm.mount: Deactivated successfully.
Jan 30 04:26:11 np0005601977 systemd[1]: var-lib-containers-storage-overlay-7e43944d7c88807a591b646492688e4d2c47062cbef55e484f17fefb4d523aa8-merged.mount: Deactivated successfully.
Jan 30 04:26:11 np0005601977 podman[213649]: 2026-01-30 09:26:11.15481628 +0000 UTC m=+0.083215459 container cleanup f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:26:11 np0005601977 systemd[1]: libpod-conmon-f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd.scope: Deactivated successfully.
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.190 183134 INFO nova.virt.libvirt.driver [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Instance destroyed successfully.#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.191 183134 DEBUG nova.objects.instance [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid 11292039-d151-44b8-87a9-a58bbc82deaa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:11 np0005601977 podman[213684]: 2026-01-30 09:26:11.202407275 +0000 UTC m=+0.035351035 container remove f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.205 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[97eae6c0-f9a7-4171-9311-7573ec796eb7]: (4, ('Fri Jan 30 09:26:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30 (f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd)\nf6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd\nFri Jan 30 09:26:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-175868ce-3812-409c-871e-725dea7b3f30 (f6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd)\nf6bd3bfcd4e8f9ba8e3738b96c0b5d15de8723e45f0d8801529a0960ffc59efd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.207 183134 DEBUG nova.virt.libvirt.vif [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:25:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2130032467',display_name='tempest-TestGettingAddress-server-2130032467',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2130032467',id=9,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL+BySKS7cSsEJM1gGpSuc/gl0kYfJFv54Hi5jUq0ai9Z4VaR6LjUWtPODzZlmSBuVVJSx3dL7rZ83jUQgropZ6wwOTftLSgNiaOJW3HXwFNeQZ0eUotqoI6Bi1PJ9yeCw==',key_name='tempest-TestGettingAddress-1319690601',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:25:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-fklua2y8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:25:42Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=11292039-d151-44b8-87a9-a58bbc82deaa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.206 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8d57485b-dc5e-490f-98fa-a2b86c8dfdab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.207 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap175868ce-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.208 183134 DEBUG nova.network.os_vif_util [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.208 183134 DEBUG nova.network.os_vif_util [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.209 183134 DEBUG os_vif [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.211 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.213 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72592aff-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.214 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.214 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 kernel: tap175868ce-30: left promiscuous mode
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.215 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.219 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.220 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.222 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[94916024-c717-49d9-9ca2-c367813fff02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.223 183134 INFO os_vif [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:c6:8c,bridge_name='br-int',has_traffic_filtering=True,id=72592aff-3a8e-4d04-a6b8-d59e2c43fade,network=Network(175868ce-3812-409c-871e-725dea7b3f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72592aff-3a')#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.223 183134 INFO nova.virt.libvirt.driver [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Deleting instance files /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa_del#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.224 183134 INFO nova.virt.libvirt.driver [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Deletion of /var/lib/nova/instances/11292039-d151-44b8-87a9-a58bbc82deaa_del complete#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.234 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ad713e-32b7-4b24-9557-91228f3af3d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.235 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ac419cd4-3e9c-4d76-91eb-00e56f90debe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.247 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3fbf44-c62f-40c7-a7da-8c4d487ed31c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366701, 'reachable_time': 31485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213716, 'error': None, 'target': 'ovnmeta-175868ce-3812-409c-871e-725dea7b3f30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.249 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-175868ce-3812-409c-871e-725dea7b3f30 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:26:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:11.249 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[196bd9bc-03b8-4799-b1f1-498c44975c7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:11 np0005601977 systemd[1]: run-netns-ovnmeta\x2d175868ce\x2d3812\x2d409c\x2d871e\x2d725dea7b3f30.mount: Deactivated successfully.
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.295 183134 INFO nova.compute.manager [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.296 183134 DEBUG oslo.service.loopingcall [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.296 183134 DEBUG nova.compute.manager [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:26:11 np0005601977 nova_compute[183130]: 2026-01-30 09:26:11.297 183134 DEBUG nova.network.neutron [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:26:13 np0005601977 nova_compute[183130]: 2026-01-30 09:26:13.317 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:13 np0005601977 nova_compute[183130]: 2026-01-30 09:26:13.977 183134 DEBUG nova.network.neutron [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Successfully created port: 6a96c970-8213-4137-b6a7-4c31f1488ad5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.027 183134 DEBUG nova.network.neutron [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updated VIF entry in instance network info cache for port 72592aff-3a8e-4d04-a6b8-d59e2c43fade. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.028 183134 DEBUG nova.network.neutron [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updating instance_info_cache with network_info: [{"id": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "address": "fa:16:3e:74:c6:8c", "network": {"id": "175868ce-3812-409c-871e-725dea7b3f30", "bridge": "br-int", "label": "tempest-network-smoke--1143654337", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c68c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72592aff-3a", "ovs_interfaceid": "72592aff-3a8e-4d04-a6b8-d59e2c43fade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.056 183134 DEBUG oslo_concurrency.lockutils [req-3e6834ab-2f65-4b76-b384-829e601ac1ae req-3e481299-01e7-40e2-995b-04080da27244 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-11292039-d151-44b8-87a9-a58bbc82deaa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.457 183134 DEBUG nova.compute.manager [req-8bde92eb-103e-4bd0-915d-5a92b223817e req-937c9713-083d-4e6d-98d4-2fa0aa0dfb3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-vif-unplugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.457 183134 DEBUG oslo_concurrency.lockutils [req-8bde92eb-103e-4bd0-915d-5a92b223817e req-937c9713-083d-4e6d-98d4-2fa0aa0dfb3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.457 183134 DEBUG oslo_concurrency.lockutils [req-8bde92eb-103e-4bd0-915d-5a92b223817e req-937c9713-083d-4e6d-98d4-2fa0aa0dfb3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.458 183134 DEBUG oslo_concurrency.lockutils [req-8bde92eb-103e-4bd0-915d-5a92b223817e req-937c9713-083d-4e6d-98d4-2fa0aa0dfb3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.458 183134 DEBUG nova.compute.manager [req-8bde92eb-103e-4bd0-915d-5a92b223817e req-937c9713-083d-4e6d-98d4-2fa0aa0dfb3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] No waiting events found dispatching network-vif-unplugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.458 183134 DEBUG nova.compute.manager [req-8bde92eb-103e-4bd0-915d-5a92b223817e req-937c9713-083d-4e6d-98d4-2fa0aa0dfb3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-vif-unplugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.507 183134 DEBUG nova.network.neutron [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.532 183134 INFO nova.compute.manager [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Took 3.24 seconds to deallocate network for instance.#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.594 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.595 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.697 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.706 183134 DEBUG nova.compute.provider_tree [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.750 183134 DEBUG nova.scheduler.client.report [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.787 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:14 np0005601977 podman[213717]: 2026-01-30 09:26:14.823796487 +0000 UTC m=+0.043261402 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 30 04:26:14 np0005601977 podman[213718]: 2026-01-30 09:26:14.823823368 +0000 UTC m=+0.040715999 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.824 183134 INFO nova.scheduler.client.report [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 11292039-d151-44b8-87a9-a58bbc82deaa#033[00m
Jan 30 04:26:14 np0005601977 nova_compute[183130]: 2026-01-30 09:26:14.901 183134 DEBUG oslo_concurrency.lockutils [None req-332869f6-c9e4-41cd-992e-f115baa508be 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:16 np0005601977 nova_compute[183130]: 2026-01-30 09:26:16.213 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:17 np0005601977 nova_compute[183130]: 2026-01-30 09:26:17.391 183134 DEBUG nova.network.neutron [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Successfully updated port: 6a96c970-8213-4137-b6a7-4c31f1488ad5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:26:17 np0005601977 nova_compute[183130]: 2026-01-30 09:26:17.412 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:17 np0005601977 nova_compute[183130]: 2026-01-30 09:26:17.413 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:17 np0005601977 nova_compute[183130]: 2026-01-30 09:26:17.413 183134 DEBUG nova.network.neutron [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:26:18 np0005601977 nova_compute[183130]: 2026-01-30 09:26:18.270 183134 DEBUG nova.network.neutron [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:26:18 np0005601977 nova_compute[183130]: 2026-01-30 09:26:18.569 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:19 np0005601977 nova_compute[183130]: 2026-01-30 09:26:19.708 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.317 183134 DEBUG nova.network.neutron [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.344 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.345 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance network_info: |[{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.348 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Start _get_guest_xml network_info=[{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.351 183134 WARNING nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.356 183134 DEBUG nova.virt.libvirt.host [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.356 183134 DEBUG nova.virt.libvirt.host [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.359 183134 DEBUG nova.virt.libvirt.host [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.360 183134 DEBUG nova.virt.libvirt.host [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.361 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.361 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.362 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.362 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.363 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.363 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.363 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.363 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.364 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.364 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.364 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.365 183134 DEBUG nova.virt.hardware [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.371 183134 DEBUG nova.virt.libvirt.vif [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:26:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-809581554',display_name='tempest-TestNetworkAdvancedServerOps-server-809581554',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-809581554',id=12,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFkWWUcp4/hru5LqJv27yDpf+1+iRsZOi/M8GbB/5I7iXHwxefzcmFLbcQt/GAvJQU8x8sEPj2RwtuV5cYtwmvilMTvuGdMtuc0URBoOB56Fvi6bvlKiWYrL++0Kht1i5g==',key_name='tempest-TestNetworkAdvancedServerOps-1685274608',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-hjlly9q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:26:10Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=37aaa571-2821-4d88-b360-9f7b02c1aa1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.371 183134 DEBUG nova.network.os_vif_util [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.372 183134 DEBUG nova.network.os_vif_util [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.373 183134 DEBUG nova.objects.instance [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 37aaa571-2821-4d88-b360-9f7b02c1aa1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.394 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <uuid>37aaa571-2821-4d88-b360-9f7b02c1aa1b</uuid>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <name>instance-0000000c</name>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-809581554</nova:name>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:26:20</nova:creationTime>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        <nova:port uuid="6a96c970-8213-4137-b6a7-4c31f1488ad5">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <entry name="serial">37aaa571-2821-4d88-b360-9f7b02c1aa1b</entry>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <entry name="uuid">37aaa571-2821-4d88-b360-9f7b02c1aa1b</entry>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:54:7d:f1"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <target dev="tap6a96c970-82"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/console.log" append="off"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:26:20 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:26:20 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:26:20 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:26:20 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.396 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Preparing to wait for external event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.396 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.397 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.397 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.398 183134 DEBUG nova.virt.libvirt.vif [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:26:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-809581554',display_name='tempest-TestNetworkAdvancedServerOps-server-809581554',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-809581554',id=12,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFkWWUcp4/hru5LqJv27yDpf+1+iRsZOi/M8GbB/5I7iXHwxefzcmFLbcQt/GAvJQU8x8sEPj2RwtuV5cYtwmvilMTvuGdMtuc0URBoOB56Fvi6bvlKiWYrL++0Kht1i5g==',key_name='tempest-TestNetworkAdvancedServerOps-1685274608',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-hjlly9q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:26:10Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=37aaa571-2821-4d88-b360-9f7b02c1aa1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.399 183134 DEBUG nova.network.os_vif_util [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.400 183134 DEBUG nova.network.os_vif_util [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.400 183134 DEBUG os_vif [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.401 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.402 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.402 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.404 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.405 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a96c970-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.405 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a96c970-82, col_values=(('external_ids', {'iface-id': '6a96c970-8213-4137-b6a7-4c31f1488ad5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:7d:f1', 'vm-uuid': '37aaa571-2821-4d88-b360-9f7b02c1aa1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.407 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:20 np0005601977 NetworkManager[55565]: <info>  [1769765180.4085] manager: (tap6a96c970-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.410 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.411 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.412 183134 INFO os_vif [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82')#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.538 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.538 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.538 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:54:7d:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:26:20 np0005601977 nova_compute[183130]: 2026-01-30 09:26:20.539 183134 INFO nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Using config drive#033[00m
Jan 30 04:26:20 np0005601977 podman[213765]: 2026-01-30 09:26:20.543961887 +0000 UTC m=+0.098651271 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.147 183134 INFO nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Creating config drive at /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.151 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqom6tlwk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.269 183134 DEBUG oslo_concurrency.processutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqom6tlwk" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:22 np0005601977 kernel: tap6a96c970-82: entered promiscuous mode
Jan 30 04:26:22 np0005601977 NetworkManager[55565]: <info>  [1769765182.3194] manager: (tap6a96c970-82): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.321 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:22Z|00074|binding|INFO|Claiming lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 for this chassis.
Jan 30 04:26:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:22Z|00075|binding|INFO|6a96c970-8213-4137-b6a7-4c31f1488ad5: Claiming fa:16:3e:54:7d:f1 10.100.0.9
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.324 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:22Z|00076|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 ovn-installed in OVS
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.329 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.331 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 systemd-udevd[213810]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:26:22 np0005601977 systemd-machined[154431]: New machine qemu-6-instance-0000000c.
Jan 30 04:26:22 np0005601977 NetworkManager[55565]: <info>  [1769765182.3680] device (tap6a96c970-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:26:22 np0005601977 NetworkManager[55565]: <info>  [1769765182.3689] device (tap6a96c970-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:26:22 np0005601977 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Jan 30 04:26:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:22Z|00077|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 up in Southbound
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.405 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:f1 10.100.0.9'], port_security=['fa:16:3e:54:7d:f1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf368f4-a96c-4392-8db3-50f404160fc3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2b8909af-505c-44a2-86bd-406e9cde5945', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2106cc88-5033-406a-bbeb-096c7422d7cf, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6a96c970-8213-4137-b6a7-4c31f1488ad5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.406 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6a96c970-8213-4137-b6a7-4c31f1488ad5 in datapath 9cf368f4-a96c-4392-8db3-50f404160fc3 bound to our chassis#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.409 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf368f4-a96c-4392-8db3-50f404160fc3#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.418 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[38e28405-143d-4c03-865c-6e651b669f32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.420 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf368f4-a1 in ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.422 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf368f4-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.422 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bf070e-841a-4184-83a2-5a3546a61008]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.423 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0d3a1190-5aca-4335-ab8e-d9aa031ce24b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.437 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[e8455ed4-f792-402a-bbb4-106df52f50dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.466 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb89203-6fe7-4d4c-910a-be9e6982257a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.488 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[4952c440-020d-4ac9-8a0c-869107bce063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 NetworkManager[55565]: <info>  [1769765182.4955] manager: (tap9cf368f4-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Jan 30 04:26:22 np0005601977 systemd-udevd[213812]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.496 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f3adc49a-3cb0-493c-aac9-d348a70c5ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.528 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[566399a5-d959-4fc1-82d8-479d85d57af3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.531 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[fd97f292-326e-40b0-a854-8089c6415cba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 NetworkManager[55565]: <info>  [1769765182.5485] device (tap9cf368f4-a0): carrier: link connected
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.552 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[58e78eb9-b1e6-40d0-9af5-8832baaa3c3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.565 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cc02d22a-48dc-4245-802a-2ed739945cdd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf368f4-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:13:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371388, 'reachable_time': 33446, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213843, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.582 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a665bd21-e47d-418a-944f-9f28b5a5219c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:13e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371388, 'tstamp': 371388}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213844, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.598 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbc8e3c-2496-4b1b-b3ef-23635f4c0cd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf368f4-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:13:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371388, 'reachable_time': 33446, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213845, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.627 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e9b408-44ff-479e-8345-3fb89603a0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.668 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff54133-faca-455c-a470-794260d850be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.669 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf368f4-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.669 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.670 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf368f4-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.671 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 kernel: tap9cf368f4-a0: entered promiscuous mode
Jan 30 04:26:22 np0005601977 NetworkManager[55565]: <info>  [1769765182.6724] manager: (tap9cf368f4-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.673 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf368f4-a0, col_values=(('external_ids', {'iface-id': '0c4430cb-14c7-41cc-8074-5659c58e2db6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.674 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:22Z|00078|binding|INFO|Releasing lport 0c4430cb-14c7-41cc-8074-5659c58e2db6 from this chassis (sb_readonly=0)
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.678 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.679 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.679 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf368f4-a96c-4392-8db3-50f404160fc3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf368f4-a96c-4392-8db3-50f404160fc3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.679 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[62ae563c-0066-443b-a12a-1a25ee934444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.680 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-9cf368f4-a96c-4392-8db3-50f404160fc3
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/9cf368f4-a96c-4392-8db3-50f404160fc3.pid.haproxy
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 9cf368f4-a96c-4392-8db3-50f404160fc3
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:26:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:22.680 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'env', 'PROCESS_TAG=haproxy-9cf368f4-a96c-4392-8db3-50f404160fc3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf368f4-a96c-4392-8db3-50f404160fc3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:26:22 np0005601977 podman[213882]: 2026-01-30 09:26:22.985232 +0000 UTC m=+0.047177125 container create 91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.988 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765182.988003, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:22 np0005601977 nova_compute[183130]: 2026-01-30 09:26:22.989 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Started (Lifecycle Event)#033[00m
Jan 30 04:26:23 np0005601977 systemd[1]: Started libpod-conmon-91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e.scope.
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.017 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.021 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765182.9881184, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.022 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:26:23 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:26:23 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1480bc46094f81bcfadddda754a890c4328c9831ae71e0bfbf2879690fb377c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:26:23 np0005601977 podman[213882]: 2026-01-30 09:26:23.052631594 +0000 UTC m=+0.114576779 container init 91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.054 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:23 np0005601977 podman[213882]: 2026-01-30 09:26:22.961513989 +0000 UTC m=+0.023459114 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:26:23 np0005601977 podman[213882]: 2026-01-30 09:26:23.056825954 +0000 UTC m=+0.118771089 container start 91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.056 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.080 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:26:23 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [NOTICE]   (213900) : New worker (213902) forked
Jan 30 04:26:23 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [NOTICE]   (213900) : Loading success.
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.703 183134 DEBUG nova.compute.manager [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.703 183134 DEBUG oslo_concurrency.lockutils [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.704 183134 DEBUG oslo_concurrency.lockutils [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.705 183134 DEBUG oslo_concurrency.lockutils [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "11292039-d151-44b8-87a9-a58bbc82deaa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.705 183134 DEBUG nova.compute.manager [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] No waiting events found dispatching network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.706 183134 WARNING nova.compute.manager [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received unexpected event network-vif-plugged-72592aff-3a8e-4d04-a6b8-d59e2c43fade for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.706 183134 DEBUG nova.compute.manager [req-f7f67544-70d9-421d-b48b-2962b6e65d11 req-1b1375e8-edcc-4aef-ac32-afa083b93a4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Received event network-vif-deleted-72592aff-3a8e-4d04-a6b8-d59e2c43fade external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.840 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.840 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.858 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.975 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.976 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.984 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:26:23 np0005601977 nova_compute[183130]: 2026-01-30 09:26:23.984 183134 INFO nova.compute.claims [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.208 183134 DEBUG nova.compute.provider_tree [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.231 183134 DEBUG nova.scheduler.client.report [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.254 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.255 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.321 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.322 183134 DEBUG nova.network.neutron [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.363 183134 INFO nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.390 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.524 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.527 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.528 183134 INFO nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Creating image(s)#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.529 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.529 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.531 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.561 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.576 183134 DEBUG nova.policy [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.610 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.611 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.612 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.634 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.680 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.681 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.705 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.706 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.707 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.720 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.776 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.777 183134 DEBUG nova.virt.disk.api [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.777 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.828 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.834 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.835 183134 DEBUG nova.virt.disk.api [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.835 183134 DEBUG nova.objects.instance [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid 0b877545-56fc-40ba-b8dc-bae466bb2064 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.880 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.881 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Ensure instance console log exists: /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.882 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.882 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:24 np0005601977 nova_compute[183130]: 2026-01-30 09:26:24.882 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.407 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.525 183134 DEBUG nova.compute.manager [req-c4b6a088-cb64-4629-ba5c-4c1eb1c04377 req-23e59047-842f-4129-9ba4-a63a5c649fb2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.525 183134 DEBUG oslo_concurrency.lockutils [req-c4b6a088-cb64-4629-ba5c-4c1eb1c04377 req-23e59047-842f-4129-9ba4-a63a5c649fb2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.526 183134 DEBUG oslo_concurrency.lockutils [req-c4b6a088-cb64-4629-ba5c-4c1eb1c04377 req-23e59047-842f-4129-9ba4-a63a5c649fb2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.526 183134 DEBUG oslo_concurrency.lockutils [req-c4b6a088-cb64-4629-ba5c-4c1eb1c04377 req-23e59047-842f-4129-9ba4-a63a5c649fb2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.526 183134 DEBUG nova.compute.manager [req-c4b6a088-cb64-4629-ba5c-4c1eb1c04377 req-23e59047-842f-4129-9ba4-a63a5c649fb2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Processing event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.528 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.532 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765185.5321434, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.532 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.535 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.538 183134 INFO nova.virt.libvirt.driver [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance spawned successfully.#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.539 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.713 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.720 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.724 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.725 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.725 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.726 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.726 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.727 183134 DEBUG nova.virt.libvirt.driver [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.835 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:26:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:25.940 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:26:25 np0005601977 nova_compute[183130]: 2026-01-30 09:26:25.941 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:25.941 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.085 183134 INFO nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Took 15.26 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.086 183134 DEBUG nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.188 183134 INFO nova.compute.manager [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Took 15.99 seconds to build instance.#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.189 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765171.1881444, 11292039-d151-44b8-87a9-a58bbc82deaa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.190 183134 INFO nova.compute.manager [-] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.213 183134 DEBUG nova.compute.manager [None req-2aff26f0-0ac7-44a3-a80f-81b6fbb44afe - - - - - -] [instance: 11292039-d151-44b8-87a9-a58bbc82deaa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:26 np0005601977 nova_compute[183130]: 2026-01-30 09:26:26.218 183134 DEBUG oslo_concurrency.lockutils [None req-b8855c9e-e4e2-4f9c-af98-bb07ebfb62a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:27 np0005601977 nova_compute[183130]: 2026-01-30 09:26:27.040 183134 DEBUG nova.network.neutron [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Successfully created port: 94783e16-931b-49a5-9848-2b5c6206ac8a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:26:27 np0005601977 podman[213926]: 2026-01-30 09:26:27.828497789 +0000 UTC m=+0.046955538 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:26:28 np0005601977 nova_compute[183130]: 2026-01-30 09:26:28.141 183134 DEBUG nova.compute.manager [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:28 np0005601977 nova_compute[183130]: 2026-01-30 09:26:28.141 183134 DEBUG nova.compute.manager [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing instance network info cache due to event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:26:28 np0005601977 nova_compute[183130]: 2026-01-30 09:26:28.142 183134 DEBUG oslo_concurrency.lockutils [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:28 np0005601977 nova_compute[183130]: 2026-01-30 09:26:28.142 183134 DEBUG oslo_concurrency.lockutils [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:28 np0005601977 nova_compute[183130]: 2026-01-30 09:26:28.142 183134 DEBUG nova.network.neutron [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:26:29 np0005601977 nova_compute[183130]: 2026-01-30 09:26:29.129 183134 DEBUG nova.network.neutron [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Successfully updated port: 94783e16-931b-49a5-9848-2b5c6206ac8a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:26:29 np0005601977 nova_compute[183130]: 2026-01-30 09:26:29.159 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:29 np0005601977 nova_compute[183130]: 2026-01-30 09:26:29.160 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:29 np0005601977 nova_compute[183130]: 2026-01-30 09:26:29.160 183134 DEBUG nova.network.neutron [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:26:29 np0005601977 nova_compute[183130]: 2026-01-30 09:26:29.726 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:29 np0005601977 nova_compute[183130]: 2026-01-30 09:26:29.739 183134 DEBUG nova.network.neutron [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:26:30 np0005601977 nova_compute[183130]: 2026-01-30 09:26:30.409 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:30 np0005601977 nova_compute[183130]: 2026-01-30 09:26:30.949 183134 DEBUG nova.network.neutron [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updated VIF entry in instance network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:26:30 np0005601977 nova_compute[183130]: 2026-01-30 09:26:30.949 183134 DEBUG nova.network.neutron [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:30 np0005601977 nova_compute[183130]: 2026-01-30 09:26:30.977 183134 DEBUG oslo_concurrency.lockutils [req-9a2f261f-e292-4dc9-98c6-af288413b33b req-9affa582-8f9e-463b-9d31-5ed6430d58e7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.040 183134 DEBUG nova.compute.manager [req-70673e26-b3f4-44be-b23e-dae27cd5c960 req-1ade967f-2bdb-431b-a4e8-ee6b43ed4395 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.041 183134 DEBUG oslo_concurrency.lockutils [req-70673e26-b3f4-44be-b23e-dae27cd5c960 req-1ade967f-2bdb-431b-a4e8-ee6b43ed4395 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.042 183134 DEBUG oslo_concurrency.lockutils [req-70673e26-b3f4-44be-b23e-dae27cd5c960 req-1ade967f-2bdb-431b-a4e8-ee6b43ed4395 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.042 183134 DEBUG oslo_concurrency.lockutils [req-70673e26-b3f4-44be-b23e-dae27cd5c960 req-1ade967f-2bdb-431b-a4e8-ee6b43ed4395 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.042 183134 DEBUG nova.compute.manager [req-70673e26-b3f4-44be-b23e-dae27cd5c960 req-1ade967f-2bdb-431b-a4e8-ee6b43ed4395 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.043 183134 WARNING nova.compute.manager [req-70673e26-b3f4-44be-b23e-dae27cd5c960 req-1ade967f-2bdb-431b-a4e8-ee6b43ed4395 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.597 183134 DEBUG nova.network.neutron [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updating instance_info_cache with network_info: [{"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.620 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.620 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Instance network_info: |[{"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.622 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Start _get_guest_xml network_info=[{"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.626 183134 WARNING nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.630 183134 DEBUG nova.virt.libvirt.host [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.631 183134 DEBUG nova.virt.libvirt.host [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.635 183134 DEBUG nova.virt.libvirt.host [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.635 183134 DEBUG nova.virt.libvirt.host [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.636 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.637 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.637 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.637 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.637 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.638 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.638 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.638 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.638 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.639 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.639 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.639 183134 DEBUG nova.virt.hardware [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.642 183134 DEBUG nova.virt.libvirt.vif [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=13,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAVkv4mmVB5UaJ/8a+W0yTjdNMEDMDWZjiKyZ3lwpRNOPSKB1Q29ZN8O4ciWDNgK73rVrAmTwaP6vS7FsjkG0RG4rLQ39r9X3YRlBvA05Mvzev06/wQvZTphpUIMAHVVOA==',key_name='tempest-TestSecurityGroupsBasicOps-1567987350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-dze3j61d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:26:24Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=0b877545-56fc-40ba-b8dc-bae466bb2064,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.643 183134 DEBUG nova.network.os_vif_util [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.643 183134 DEBUG nova.network.os_vif_util [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.644 183134 DEBUG nova.objects.instance [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b877545-56fc-40ba-b8dc-bae466bb2064 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.659 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <uuid>0b877545-56fc-40ba-b8dc-bae466bb2064</uuid>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <name>instance-0000000d</name>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063</nova:name>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:26:31</nova:creationTime>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        <nova:port uuid="94783e16-931b-49a5-9848-2b5c6206ac8a">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <entry name="serial">0b877545-56fc-40ba-b8dc-bae466bb2064</entry>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <entry name="uuid">0b877545-56fc-40ba-b8dc-bae466bb2064</entry>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.config"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:87:bf:ec"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <target dev="tap94783e16-93"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/console.log" append="off"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:26:31 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:26:31 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:26:31 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:26:31 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.660 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Preparing to wait for external event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.660 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.661 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.661 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.661 183134 DEBUG nova.virt.libvirt.vif [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=13,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAVkv4mmVB5UaJ/8a+W0yTjdNMEDMDWZjiKyZ3lwpRNOPSKB1Q29ZN8O4ciWDNgK73rVrAmTwaP6vS7FsjkG0RG4rLQ39r9X3YRlBvA05Mvzev06/wQvZTphpUIMAHVVOA==',key_name='tempest-TestSecurityGroupsBasicOps-1567987350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-dze3j61d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:26:24Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=0b877545-56fc-40ba-b8dc-bae466bb2064,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.662 183134 DEBUG nova.network.os_vif_util [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.662 183134 DEBUG nova.network.os_vif_util [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.662 183134 DEBUG os_vif [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.663 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.663 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.663 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.666 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.667 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94783e16-93, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.667 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap94783e16-93, col_values=(('external_ids', {'iface-id': '94783e16-931b-49a5-9848-2b5c6206ac8a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:bf:ec', 'vm-uuid': '0b877545-56fc-40ba-b8dc-bae466bb2064'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.669 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.671 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:26:31 np0005601977 NetworkManager[55565]: <info>  [1769765191.6717] manager: (tap94783e16-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.675 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.676 183134 INFO os_vif [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93')#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.740 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.740 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.740 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:87:bf:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:26:31 np0005601977 nova_compute[183130]: 2026-01-30 09:26:31.741 183134 INFO nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Using config drive#033[00m
Jan 30 04:26:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:31.944 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.510 183134 INFO nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Creating config drive at /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.config#033[00m
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.514 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpadxf90fb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.636 183134 DEBUG oslo_concurrency.processutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpadxf90fb" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:32 np0005601977 kernel: tap94783e16-93: entered promiscuous mode
Jan 30 04:26:32 np0005601977 NetworkManager[55565]: <info>  [1769765192.6714] manager: (tap94783e16-93): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 30 04:26:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:32Z|00079|binding|INFO|Claiming lport 94783e16-931b-49a5-9848-2b5c6206ac8a for this chassis.
Jan 30 04:26:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:32Z|00080|binding|INFO|94783e16-931b-49a5-9848-2b5c6206ac8a: Claiming fa:16:3e:87:bf:ec 10.100.0.12
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.677 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.679 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:bf:ec 10.100.0.12'], port_security=['fa:16:3e:87:bf:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0b877545-56fc-40ba-b8dc-bae466bb2064', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '95f0b6cb-f834-48b3-a422-9f55b7068495 eecfbcd8-2f91-46a3-95ca-ae6e61909029', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5402d645-fdcd-44ae-9cdd-5c60cb019856, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=94783e16-931b-49a5-9848-2b5c6206ac8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:26:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:32Z|00081|binding|INFO|Setting lport 94783e16-931b-49a5-9848-2b5c6206ac8a ovn-installed in OVS
Jan 30 04:26:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:32Z|00082|binding|INFO|Setting lport 94783e16-931b-49a5-9848-2b5c6206ac8a up in Southbound
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.682 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 94783e16-931b-49a5-9848-2b5c6206ac8a in datapath 4b6635c4-cf50-4be3-bead-3fd5f833ac92 bound to our chassis#033[00m
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.683 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.685 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b6635c4-cf50-4be3-bead-3fd5f833ac92#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.692 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6a184b-664e-4841-948a-84e296d9069e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.693 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b6635c4-c1 in ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.695 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b6635c4-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.696 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4e413384-55c4-44a5-ac9b-04ffe048b981]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 systemd-udevd[213972]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.696 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[02dd984c-e4c0-4b0d-be38-c8aa81840d5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 systemd-machined[154431]: New machine qemu-7-instance-0000000d.
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.704 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[7d38fe05-6d5e-4e20-9e3a-71d8f39bfb68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 NetworkManager[55565]: <info>  [1769765192.7113] device (tap94783e16-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:26:32 np0005601977 systemd[1]: Started Virtual Machine qemu-7-instance-0000000d.
Jan 30 04:26:32 np0005601977 NetworkManager[55565]: <info>  [1769765192.7122] device (tap94783e16-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.714 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d347ddd1-90f5-4f9a-9723-21281aaaca5f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.741 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[700bc874-55ea-4176-8885-ac1e5abfbb36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 NetworkManager[55565]: <info>  [1769765192.7475] manager: (tap4b6635c4-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.748 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[065fcf2e-c61e-4a12-a9ce-efb6720d2205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 systemd-udevd[213976]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.772 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5d2cc6-9f2f-4730-810a-40f100b47425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.775 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[6e537400-a736-4e70-abce-9f63a190bf08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 NetworkManager[55565]: <info>  [1769765192.7930] device (tap4b6635c4-c0): carrier: link connected
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.795 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d6508147-aa28-43ba-882a-389a56c65744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.813 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f20e9a25-6056-44f5-9068-762cf8a693c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b6635c4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:f6:6a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372413, 'reachable_time': 44602, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214004, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.826 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[09823e16-e095-42c4-b52d-56892ee73d04]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:f66a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372413, 'tstamp': 372413}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214005, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.839 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ceeca0bf-f600-4865-bed6-01207da8c1a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b6635c4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:f6:6a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372413, 'reachable_time': 44602, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214006, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.863 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[024df46f-ee31-4bfc-a7ef-c710fd3ac9e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.895 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[940e8d96-73c0-456d-beb8-fb99a4e9729c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.896 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6635c4-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.897 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.897 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b6635c4-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.898 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:32 np0005601977 NetworkManager[55565]: <info>  [1769765192.8994] manager: (tap4b6635c4-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 30 04:26:32 np0005601977 kernel: tap4b6635c4-c0: entered promiscuous mode
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.901 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.903 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b6635c4-c0, col_values=(('external_ids', {'iface-id': '50e65278-673c-4dc7-b450-b3f067941ab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.904 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:32Z|00083|binding|INFO|Releasing lport 50e65278-673c-4dc7-b450-b3f067941ab2 from this chassis (sb_readonly=0)
Jan 30 04:26:32 np0005601977 nova_compute[183130]: 2026-01-30 09:26:32.909 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.911 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b6635c4-cf50-4be3-bead-3fd5f833ac92.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b6635c4-cf50-4be3-bead-3fd5f833ac92.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.912 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee71339-27f1-4983-8771-4bb7ed6160c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.912 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-4b6635c4-cf50-4be3-bead-3fd5f833ac92
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/4b6635c4-cf50-4be3-bead-3fd5f833ac92.pid.haproxy
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 4b6635c4-cf50-4be3-bead-3fd5f833ac92
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:26:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:32.913 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'env', 'PROCESS_TAG=haproxy-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b6635c4-cf50-4be3-bead-3fd5f833ac92.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.034 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765193.0344708, 0b877545-56fc-40ba-b8dc-bae466bb2064 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.035 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] VM Started (Lifecycle Event)#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.054 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.058 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765193.0352116, 0b877545-56fc-40ba-b8dc-bae466bb2064 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.059 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.094 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.098 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.132 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:26:33 np0005601977 podman[214045]: 2026-01-30 09:26:33.240750142 +0000 UTC m=+0.047025530 container create 5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 30 04:26:33 np0005601977 systemd[1]: Started libpod-conmon-5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48.scope.
Jan 30 04:26:33 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:26:33 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5144771f2afdfe1a79ccfba01f37f0bdc8fd174f88b73b51054ced42acf5cd3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:26:33 np0005601977 podman[214045]: 2026-01-30 09:26:33.292332263 +0000 UTC m=+0.098607671 container init 5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 30 04:26:33 np0005601977 podman[214045]: 2026-01-30 09:26:33.296829832 +0000 UTC m=+0.103105220 container start 5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:26:33 np0005601977 podman[214045]: 2026-01-30 09:26:33.223256071 +0000 UTC m=+0.029531479 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:26:33 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [NOTICE]   (214064) : New worker (214066) forked
Jan 30 04:26:33 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [NOTICE]   (214064) : Loading success.
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.612 183134 DEBUG nova.compute.manager [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-changed-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.613 183134 DEBUG nova.compute.manager [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Refreshing instance network info cache due to event network-changed-94783e16-931b-49a5-9848-2b5c6206ac8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.614 183134 DEBUG oslo_concurrency.lockutils [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.615 183134 DEBUG oslo_concurrency.lockutils [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:33 np0005601977 nova_compute[183130]: 2026-01-30 09:26:33.615 183134 DEBUG nova.network.neutron [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Refreshing network info cache for port 94783e16-931b-49a5-9848-2b5c6206ac8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:26:34 np0005601977 nova_compute[183130]: 2026-01-30 09:26:34.731 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:35 np0005601977 nova_compute[183130]: 2026-01-30 09:26:35.039 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:35 np0005601977 nova_compute[183130]: 2026-01-30 09:26:35.579 183134 DEBUG nova.network.neutron [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updated VIF entry in instance network info cache for port 94783e16-931b-49a5-9848-2b5c6206ac8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:26:35 np0005601977 nova_compute[183130]: 2026-01-30 09:26:35.580 183134 DEBUG nova.network.neutron [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updating instance_info_cache with network_info: [{"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:35 np0005601977 nova_compute[183130]: 2026-01-30 09:26:35.603 183134 DEBUG oslo_concurrency.lockutils [req-63b10a50-31cc-468c-bbb5-8ee0c4cfa986 req-57cc6c98-d6ba-42e4-8976-497b4f8e79b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:36 np0005601977 nova_compute[183130]: 2026-01-30 09:26:36.670 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:36Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:7d:f1 10.100.0.9
Jan 30 04:26:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:36Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:7d:f1 10.100.0.9
Jan 30 04:26:36 np0005601977 podman[214090]: 2026-01-30 09:26:36.844453457 +0000 UTC m=+0.056419280 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, release=1769056855, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:26:36 np0005601977 podman[214091]: 2026-01-30 09:26:36.851924121 +0000 UTC m=+0.060166137 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:26:38 np0005601977 nova_compute[183130]: 2026-01-30 09:26:38.479 183134 DEBUG nova.compute.manager [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:38 np0005601977 nova_compute[183130]: 2026-01-30 09:26:38.479 183134 DEBUG nova.compute.manager [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing instance network info cache due to event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:26:38 np0005601977 nova_compute[183130]: 2026-01-30 09:26:38.480 183134 DEBUG oslo_concurrency.lockutils [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:38 np0005601977 nova_compute[183130]: 2026-01-30 09:26:38.480 183134 DEBUG oslo_concurrency.lockutils [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:38 np0005601977 nova_compute[183130]: 2026-01-30 09:26:38.481 183134 DEBUG nova.network.neutron [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:26:39 np0005601977 nova_compute[183130]: 2026-01-30 09:26:39.731 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:41 np0005601977 nova_compute[183130]: 2026-01-30 09:26:41.297 183134 DEBUG nova.network.neutron [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updated VIF entry in instance network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:26:41 np0005601977 nova_compute[183130]: 2026-01-30 09:26:41.298 183134 DEBUG nova.network.neutron [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:41 np0005601977 nova_compute[183130]: 2026-01-30 09:26:41.330 183134 DEBUG oslo_concurrency.lockutils [req-b2ed1348-85a4-4d73-8732-d2bb40d019aa req-f0a67ac4-de57-44e2-b843-9340378e5e97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:41 np0005601977 nova_compute[183130]: 2026-01-30 09:26:41.689 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.585 183134 DEBUG nova.compute.manager [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.585 183134 DEBUG oslo_concurrency.lockutils [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.585 183134 DEBUG oslo_concurrency.lockutils [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.586 183134 DEBUG oslo_concurrency.lockutils [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.586 183134 DEBUG nova.compute.manager [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Processing event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.586 183134 DEBUG nova.compute.manager [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.586 183134 DEBUG oslo_concurrency.lockutils [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.587 183134 DEBUG oslo_concurrency.lockutils [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.587 183134 DEBUG oslo_concurrency.lockutils [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.587 183134 DEBUG nova.compute.manager [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] No waiting events found dispatching network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.587 183134 WARNING nova.compute.manager [req-ee2a7992-3cc1-415f-a481-a71db34533c3 req-ad2a37cf-f5ae-474f-be59-89fece307eb6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received unexpected event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.588 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.592 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765202.5917919, 0b877545-56fc-40ba-b8dc-bae466bb2064 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.592 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.594 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.597 183134 INFO nova.virt.libvirt.driver [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Instance spawned successfully.#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.598 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.614 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.621 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.623 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.624 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.624 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.624 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.625 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.625 183134 DEBUG nova.virt.libvirt.driver [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.650 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.688 183134 INFO nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Took 18.16 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.688 183134 DEBUG nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.745 183134 INFO nova.compute.manager [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Took 18.81 seconds to build instance.#033[00m
Jan 30 04:26:42 np0005601977 nova_compute[183130]: 2026-01-30 09:26:42.761 183134 DEBUG oslo_concurrency.lockutils [None req-ae65c6f4-998e-4e83-9914-bbee4e84ba56 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:44 np0005601977 nova_compute[183130]: 2026-01-30 09:26:44.108 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Check if temp file /var/lib/nova/instances/tmpxaxptqxz exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 30 04:26:44 np0005601977 nova_compute[183130]: 2026-01-30 09:26:44.108 183134 DEBUG nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxaxptqxz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9c98ea59-db8f-40da-830b-351a58e44561',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 30 04:26:44 np0005601977 nova_compute[183130]: 2026-01-30 09:26:44.733 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:45 np0005601977 podman[214133]: 2026-01-30 09:26:45.830976522 +0000 UTC m=+0.050770498 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:26:45 np0005601977 podman[214132]: 2026-01-30 09:26:45.863869456 +0000 UTC m=+0.084032902 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.365 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.417 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.418 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.430 183134 INFO nova.compute.manager [None req-90f85ab5-f7c0-492e-a67d-32ef8435719d 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Get console output#033[00m
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.436 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.466 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:46 np0005601977 nova_compute[183130]: 2026-01-30 09:26:46.694 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:49 np0005601977 nova_compute[183130]: 2026-01-30 09:26:49.735 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:50 np0005601977 systemd-logind[809]: New session 27 of user nova.
Jan 30 04:26:50 np0005601977 systemd[1]: Created slice User Slice of UID 42436.
Jan 30 04:26:50 np0005601977 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 30 04:26:50 np0005601977 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 30 04:26:50 np0005601977 systemd[1]: Starting User Manager for UID 42436...
Jan 30 04:26:50 np0005601977 podman[214181]: 2026-01-30 09:26:50.909307456 +0000 UTC m=+0.116041381 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:26:50 np0005601977 systemd[214192]: Queued start job for default target Main User Target.
Jan 30 04:26:50 np0005601977 systemd[214192]: Created slice User Application Slice.
Jan 30 04:26:50 np0005601977 systemd[214192]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 30 04:26:50 np0005601977 systemd[214192]: Started Daily Cleanup of User's Temporary Directories.
Jan 30 04:26:50 np0005601977 systemd[214192]: Reached target Paths.
Jan 30 04:26:50 np0005601977 systemd[214192]: Reached target Timers.
Jan 30 04:26:50 np0005601977 systemd[214192]: Starting D-Bus User Message Bus Socket...
Jan 30 04:26:50 np0005601977 systemd[214192]: Starting Create User's Volatile Files and Directories...
Jan 30 04:26:50 np0005601977 systemd[214192]: Listening on D-Bus User Message Bus Socket.
Jan 30 04:26:50 np0005601977 systemd[214192]: Reached target Sockets.
Jan 30 04:26:50 np0005601977 systemd[214192]: Finished Create User's Volatile Files and Directories.
Jan 30 04:26:50 np0005601977 systemd[214192]: Reached target Basic System.
Jan 30 04:26:50 np0005601977 systemd[214192]: Reached target Main User Target.
Jan 30 04:26:50 np0005601977 systemd[214192]: Startup finished in 136ms.
Jan 30 04:26:50 np0005601977 systemd[1]: Started User Manager for UID 42436.
Jan 30 04:26:51 np0005601977 systemd[1]: Started Session 27 of User nova.
Jan 30 04:26:51 np0005601977 systemd-logind[809]: Session 27 logged out. Waiting for processes to exit.
Jan 30 04:26:51 np0005601977 systemd[1]: session-27.scope: Deactivated successfully.
Jan 30 04:26:51 np0005601977 systemd-logind[809]: Removed session 27.
Jan 30 04:26:51 np0005601977 nova_compute[183130]: 2026-01-30 09:26:51.697 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.479 183134 INFO nova.compute.manager [None req-7e64d3e7-98b4-4ee6-bdf6-116a805f60d7 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Get console output#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.486 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.915 183134 INFO nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Took 6.45 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.916 183134 DEBUG nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.939 183134 DEBUG nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxaxptqxz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9c98ea59-db8f-40da-830b-351a58e44561',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(773e579d-770a-48c5-a527-7dd90008a38b),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.963 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Post-copy flag unset because instance is paused. _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10184#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.964 183134 DEBUG nova.objects.instance [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c98ea59-db8f-40da-830b-351a58e44561 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.966 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.968 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.968 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.983 183134 DEBUG nova.virt.libvirt.vif [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:25:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1037943593',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1037943593',id=10,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:25:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=3,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-o7sx85if',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:25:52Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=9c98ea59-db8f-40da-830b-351a58e44561,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.983 183134 DEBUG nova.network.os_vif_util [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converting VIF {"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.984 183134 DEBUG nova.network.os_vif_util [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.985 183134 DEBUG nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updating guest XML with vif config: <interface type="ethernet">
Jan 30 04:26:52 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:25:7d:54"/>
Jan 30 04:26:52 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:26:52 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:26:52 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:26:52 np0005601977 nova_compute[183130]:  <target dev="tap67ee4400-65"/>
Jan 30 04:26:52 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:26:52 np0005601977 nova_compute[183130]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 30 04:26:52 np0005601977 nova_compute[183130]: 2026-01-30 09:26:52.985 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.355 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.373 183134 DEBUG nova.compute.manager [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.374 183134 DEBUG oslo_concurrency.lockutils [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.375 183134 DEBUG oslo_concurrency.lockutils [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.376 183134 DEBUG oslo_concurrency.lockutils [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.376 183134 DEBUG nova.compute.manager [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.376 183134 DEBUG nova.compute.manager [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.377 183134 DEBUG nova.compute.manager [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.377 183134 DEBUG oslo_concurrency.lockutils [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.378 183134 DEBUG oslo_concurrency.lockutils [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.379 183134 DEBUG oslo_concurrency.lockutils [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.379 183134 DEBUG nova.compute.manager [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.379 183134 WARNING nova.compute.manager [req-331ed7eb-7b4c-494a-9af5-d6a07bd6eec4 req-3dfe66db-dd0b-47b5-b18a-668b55d1a279 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received unexpected event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with vm_state paused and task_state migrating.#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.471 183134 DEBUG nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.471 183134 INFO nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 30 04:26:53 np0005601977 nova_compute[183130]: 2026-01-30 09:26:53.557 183134 INFO nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.083 183134 DEBUG nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.084 183134 DEBUG nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 30 04:26:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:54Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:bf:ec 10.100.0.12
Jan 30 04:26:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:54Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:bf:ec 10.100.0.12
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.587 183134 DEBUG nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.588 183134 DEBUG nova.virt.libvirt.migration [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 30 04:26:54 np0005601977 kernel: tap67ee4400-65 (unregistering): left promiscuous mode
Jan 30 04:26:54 np0005601977 NetworkManager[55565]: <info>  [1769765214.6129] device (tap67ee4400-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:26:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:54Z|00084|binding|INFO|Releasing lport 67ee4400-6557-46b1-b66a-75f59eee46ea from this chassis (sb_readonly=0)
Jan 30 04:26:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:54Z|00085|binding|INFO|Setting lport 67ee4400-6557-46b1-b66a-75f59eee46ea down in Southbound
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.632 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:54Z|00086|binding|INFO|Removing iface tap67ee4400-65 ovn-installed in OVS
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.638 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.642 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.642 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:7d:54 10.100.0.7'], port_security=['fa:16:3e:25:7d:54 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9c98ea59-db8f-40da-830b-351a58e44561', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76c186c3-e40e-4db5-b50c-3686091722f9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=67ee4400-6557-46b1-b66a-75f59eee46ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.645 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 67ee4400-6557-46b1-b66a-75f59eee46ea in datapath 8e0e3ea2-5897-4c05-8f15-ccf8330993c7 unbound from our chassis#033[00m
Jan 30 04:26:54 np0005601977 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 30 04:26:54 np0005601977 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 8.561s CPU time.
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.652 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e3ea2-5897-4c05-8f15-ccf8330993c7#033[00m
Jan 30 04:26:54 np0005601977 systemd-machined[154431]: Machine qemu-5-instance-0000000a terminated.
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.665 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a767de6c-34eb-4a26-8871-569311081467]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.685 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[935a7b34-04a6-4b34-8bef-c1b560fceaa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.689 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[54879de4-fcc3-4f9c-bfe4-36bfc0c7976d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.710 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[07be8354-f317-48f2-b8bc-36f2f1b1a10b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.721 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab76e6b-a923-43e4-89fe-f24670c81820]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e3ea2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:a6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366983, 'reachable_time': 24925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214256, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.731 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[78b6e23c-6985-43fc-9a59-bc8b0babda4b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366992, 'tstamp': 366992}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214257, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366995, 'tstamp': 366995}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214257, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.732 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e3ea2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.735 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.738 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e3ea2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.738 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.738 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.739 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e3ea2-50, col_values=(('external_ids', {'iface-id': '15b4d9a6-bad1-4bf8-a262-02e27eb8ea93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:54.739 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.848 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.848 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 30 04:26:54 np0005601977 nova_compute[183130]: 2026-01-30 09:26:54.848 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.090 183134 DEBUG nova.virt.libvirt.guest [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '9c98ea59-db8f-40da-830b-351a58e44561' (instance-0000000a) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.091 183134 INFO nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migration operation has completed#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.091 183134 INFO nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] _post_live_migration() is started..#033[00m
Jan 30 04:26:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:55Z|00087|binding|INFO|Releasing lport 0c4430cb-14c7-41cc-8074-5659c58e2db6 from this chassis (sb_readonly=0)
Jan 30 04:26:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:55Z|00088|binding|INFO|Releasing lport 50e65278-673c-4dc7-b450-b3f067941ab2 from this chassis (sb_readonly=0)
Jan 30 04:26:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:26:55Z|00089|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.160 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.369 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.398 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.398 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.399 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.399 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.467 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-changed-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.469 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Refreshing instance network info cache due to event network-changed-94783e16-931b-49a5-9848-2b5c6206ac8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.469 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.470 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.470 183134 DEBUG nova.network.neutron [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Refreshing network info cache for port 94783e16-931b-49a5-9848-2b5c6206ac8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.509 183134 DEBUG nova.compute.manager [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-changed-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.510 183134 DEBUG nova.compute.manager [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Refreshing instance network info cache due to event network-changed-67ee4400-6557-46b1-b66a-75f59eee46ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.511 183134 DEBUG oslo_concurrency.lockutils [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.511 183134 DEBUG oslo_concurrency.lockutils [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.512 183134 DEBUG nova.network.neutron [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Refreshing network info cache for port 67ee4400-6557-46b1-b66a-75f59eee46ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.525 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.599 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.600 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.680 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.688 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.775 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.776 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.825 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.836 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.890 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.891 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:55 np0005601977 nova_compute[183130]: 2026-01-30 09:26:55.945 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.113 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.114 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5142MB free_disk=73.27283477783203GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.115 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.115 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.177 183134 INFO nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating resource usage from migration 88445283-129b-4f87-a75d-ab640dcf2a52#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.177 183134 INFO nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updating resource usage from migration 773e579d-770a-48c5-a527-7dd90008a38b#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.210 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.211 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Migration 773e579d-770a-48c5-a527-7dd90008a38b is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.211 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 0b877545-56fc-40ba-b8dc-bae466bb2064 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.212 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Migration 88445283-129b-4f87-a75d-ab640dcf2a52 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.212 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.212 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.307 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.322 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.345 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.345 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.699 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.918 183134 DEBUG nova.network.neutron [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Activated binding for port 67ee4400-6557-46b1-b66a-75f59eee46ea and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.919 183134 DEBUG nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.921 183134 DEBUG nova.virt.libvirt.vif [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:25:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1037943593',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1037943593',id=10,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:25:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=3,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-o7sx85if',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:25:55Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=9c98ea59-db8f-40da-830b-351a58e44561,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.921 183134 DEBUG nova.network.os_vif_util [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converting VIF {"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.922 183134 DEBUG nova.network.os_vif_util [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.923 183134 DEBUG os_vif [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.925 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.925 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67ee4400-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.927 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.930 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.932 183134 INFO os_vif [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7d:54,bridge_name='br-int',has_traffic_filtering=True,id=67ee4400-6557-46b1-b66a-75f59eee46ea,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67ee4400-65')#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.933 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.934 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.935 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.935 183134 DEBUG nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.936 183134 INFO nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Deleting instance files /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561_del#033[00m
Jan 30 04:26:56 np0005601977 nova_compute[183130]: 2026-01-30 09:26:56.937 183134 INFO nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Deletion of /var/lib/nova/instances/9c98ea59-db8f-40da-830b-351a58e44561_del complete#033[00m
Jan 30 04:26:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:57.379 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:57.379 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:26:57.380 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.482 183134 DEBUG oslo_concurrency.lockutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.483 183134 DEBUG oslo_concurrency.lockutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.483 183134 DEBUG nova.network.neutron [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.625 183134 DEBUG nova.compute.manager [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.625 183134 DEBUG oslo_concurrency.lockutils [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.626 183134 DEBUG oslo_concurrency.lockutils [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.626 183134 DEBUG oslo_concurrency.lockutils [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.626 183134 DEBUG nova.compute.manager [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.627 183134 WARNING nova.compute.manager [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received unexpected event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with vm_state paused and task_state migrating.#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.627 183134 DEBUG nova.compute.manager [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.627 183134 DEBUG oslo_concurrency.lockutils [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.627 183134 DEBUG oslo_concurrency.lockutils [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.628 183134 DEBUG oslo_concurrency.lockutils [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.628 183134 DEBUG nova.compute.manager [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.628 183134 WARNING nova.compute.manager [req-6a089916-02c3-4eba-8808-e4a000e4d441 req-a59db40b-d45e-4e8c-b24a-bbc35e5b67f8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received unexpected event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with vm_state paused and task_state migrating.#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.718 183134 DEBUG nova.compute.manager [req-71df16eb-a978-4a64-b408-b68d3e37f4b4 req-8d4e1f86-da83-463e-9ee0-ef5a0cd197cd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.719 183134 DEBUG oslo_concurrency.lockutils [req-71df16eb-a978-4a64-b408-b68d3e37f4b4 req-8d4e1f86-da83-463e-9ee0-ef5a0cd197cd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.720 183134 DEBUG oslo_concurrency.lockutils [req-71df16eb-a978-4a64-b408-b68d3e37f4b4 req-8d4e1f86-da83-463e-9ee0-ef5a0cd197cd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.720 183134 DEBUG oslo_concurrency.lockutils [req-71df16eb-a978-4a64-b408-b68d3e37f4b4 req-8d4e1f86-da83-463e-9ee0-ef5a0cd197cd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.720 183134 DEBUG nova.compute.manager [req-71df16eb-a978-4a64-b408-b68d3e37f4b4 req-8d4e1f86-da83-463e-9ee0-ef5a0cd197cd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:57 np0005601977 nova_compute[183130]: 2026-01-30 09:26:57.721 183134 DEBUG nova.compute.manager [req-71df16eb-a978-4a64-b408-b68d3e37f4b4 req-8d4e1f86-da83-463e-9ee0-ef5a0cd197cd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.707 183134 DEBUG nova.network.neutron [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updated VIF entry in instance network info cache for port 94783e16-931b-49a5-9848-2b5c6206ac8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.708 183134 DEBUG nova.network.neutron [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updating instance_info_cache with network_info: [{"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.735 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.736 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.736 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.736 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.737 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.737 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.737 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-unplugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.738 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.738 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.739 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.739 183134 DEBUG oslo_concurrency.lockutils [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.739 183134 DEBUG nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.740 183134 WARNING nova.compute.manager [req-88d29ff1-d965-40d8-8bfb-5ca97a15de18 req-a16a9d9e-d350-40ef-b73d-06033a1bd5bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received unexpected event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with vm_state paused and task_state migrating.#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.775 183134 DEBUG nova.network.neutron [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updated VIF entry in instance network info cache for port 67ee4400-6557-46b1-b66a-75f59eee46ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.775 183134 DEBUG nova.network.neutron [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updating instance_info_cache with network_info: [{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": null, "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {"bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:58 np0005601977 nova_compute[183130]: 2026-01-30 09:26:58.819 183134 DEBUG oslo_concurrency.lockutils [req-48918175-d71f-43e7-b77a-e3012f389115 req-06a31be0-efd1-477f-9684-323a287da564 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:58 np0005601977 podman[214295]: 2026-01-30 09:26:58.829470692 +0000 UTC m=+0.048415750 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.356 183134 DEBUG nova.network.neutron [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.373 183134 DEBUG oslo_concurrency.lockutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.502 183134 DEBUG nova.virt.libvirt.driver [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.502 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Creating file /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/053d8be10ed943208399174fa70b4af4.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.502 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/053d8be10ed943208399174fa70b4af4.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.742 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.956 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/053d8be10ed943208399174fa70b4af4.tmp" returned: 1 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.957 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/053d8be10ed943208399174fa70b4af4.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.957 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Creating directory /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 30 04:26:59 np0005601977 nova_compute[183130]: 2026-01-30 09:26:59.958 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.167 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.171 183134 DEBUG nova.virt.libvirt.driver [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.315 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.362 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.367 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.368 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.410 183134 DEBUG nova.compute.manager [req-f36fafff-40b8-4971-8387-6f5aadea99b7 req-707d221f-4a87-4412-b571-2faafe317ba1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.411 183134 DEBUG oslo_concurrency.lockutils [req-f36fafff-40b8-4971-8387-6f5aadea99b7 req-707d221f-4a87-4412-b571-2faafe317ba1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.411 183134 DEBUG oslo_concurrency.lockutils [req-f36fafff-40b8-4971-8387-6f5aadea99b7 req-707d221f-4a87-4412-b571-2faafe317ba1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.412 183134 DEBUG oslo_concurrency.lockutils [req-f36fafff-40b8-4971-8387-6f5aadea99b7 req-707d221f-4a87-4412-b571-2faafe317ba1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.412 183134 DEBUG nova.compute.manager [req-f36fafff-40b8-4971-8387-6f5aadea99b7 req-707d221f-4a87-4412-b571-2faafe317ba1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] No waiting events found dispatching network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:00 np0005601977 nova_compute[183130]: 2026-01-30 09:27:00.412 183134 WARNING nova.compute.manager [req-f36fafff-40b8-4971-8387-6f5aadea99b7 req-707d221f-4a87-4412-b571-2faafe317ba1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Received unexpected event network-vif-plugged-67ee4400-6557-46b1-b66a-75f59eee46ea for instance with vm_state paused and task_state migrating.#033[00m
Jan 30 04:27:01 np0005601977 systemd[1]: Stopping User Manager for UID 42436...
Jan 30 04:27:01 np0005601977 systemd[214192]: Activating special unit Exit the Session...
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped target Main User Target.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped target Basic System.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped target Paths.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped target Sockets.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped target Timers.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 30 04:27:01 np0005601977 systemd[214192]: Closed D-Bus User Message Bus Socket.
Jan 30 04:27:01 np0005601977 systemd[214192]: Stopped Create User's Volatile Files and Directories.
Jan 30 04:27:01 np0005601977 systemd[214192]: Removed slice User Application Slice.
Jan 30 04:27:01 np0005601977 systemd[214192]: Reached target Shutdown.
Jan 30 04:27:01 np0005601977 systemd[214192]: Finished Exit the Session.
Jan 30 04:27:01 np0005601977 systemd[214192]: Reached target Exit the Session.
Jan 30 04:27:01 np0005601977 systemd[1]: user@42436.service: Deactivated successfully.
Jan 30 04:27:01 np0005601977 systemd[1]: Stopped User Manager for UID 42436.
Jan 30 04:27:01 np0005601977 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 30 04:27:01 np0005601977 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 30 04:27:01 np0005601977 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 30 04:27:01 np0005601977 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 30 04:27:01 np0005601977 systemd[1]: Removed slice User Slice of UID 42436.
Jan 30 04:27:01 np0005601977 nova_compute[183130]: 2026-01-30 09:27:01.927 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:27:02 np0005601977 kernel: tap6a96c970-82 (unregistering): left promiscuous mode
Jan 30 04:27:02 np0005601977 NetworkManager[55565]: <info>  [1769765222.3515] device (tap6a96c970-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00090|binding|INFO|Releasing lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 from this chassis (sb_readonly=0)
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.358 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00091|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 down in Southbound
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00092|binding|INFO|Removing iface tap6a96c970-82 ovn-installed in OVS
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.360 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.365 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:f1 10.100.0.9'], port_security=['fa:16:3e:54:7d:f1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf368f4-a96c-4392-8db3-50f404160fc3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2b8909af-505c-44a2-86bd-406e9cde5945', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2106cc88-5033-406a-bbeb-096c7422d7cf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6a96c970-8213-4137-b6a7-4c31f1488ad5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.366 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.367 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6a96c970-8213-4137-b6a7-4c31f1488ad5 in datapath 9cf368f4-a96c-4392-8db3-50f404160fc3 unbound from our chassis#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.369 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf368f4-a96c-4392-8db3-50f404160fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.370 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[384e931a-ed6a-4ade-b896-b824fe1f4c8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.371 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 namespace which is not needed anymore#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.376 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.377 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:27:02 np0005601977 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 30 04:27:02 np0005601977 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 12.760s CPU time.
Jan 30 04:27:02 np0005601977 systemd-machined[154431]: Machine qemu-6-instance-0000000c terminated.
Jan 30 04:27:02 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [NOTICE]   (213900) : haproxy version is 2.8.14-c23fe91
Jan 30 04:27:02 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [NOTICE]   (213900) : path to executable is /usr/sbin/haproxy
Jan 30 04:27:02 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [WARNING]  (213900) : Exiting Master process...
Jan 30 04:27:02 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [ALERT]    (213900) : Current worker (213902) exited with code 143 (Terminated)
Jan 30 04:27:02 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[213896]: [WARNING]  (213900) : All workers exited. Exiting... (0)
Jan 30 04:27:02 np0005601977 systemd[1]: libpod-91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e.scope: Deactivated successfully.
Jan 30 04:27:02 np0005601977 conmon[213896]: conmon 91e8ac10ab7e978b2699 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e.scope/container/memory.events
Jan 30 04:27:02 np0005601977 podman[214351]: 2026-01-30 09:27:02.468087019 +0000 UTC m=+0.036260021 container died 91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 30 04:27:02 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e-userdata-shm.mount: Deactivated successfully.
Jan 30 04:27:02 np0005601977 systemd[1]: var-lib-containers-storage-overlay-e1480bc46094f81bcfadddda754a890c4328c9831ae71e0bfbf2879690fb377c-merged.mount: Deactivated successfully.
Jan 30 04:27:02 np0005601977 podman[214351]: 2026-01-30 09:27:02.511017931 +0000 UTC m=+0.079190933 container cleanup 91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:27:02 np0005601977 systemd[1]: libpod-conmon-91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e.scope: Deactivated successfully.
Jan 30 04:27:02 np0005601977 kernel: tap6a96c970-82: entered promiscuous mode
Jan 30 04:27:02 np0005601977 kernel: tap6a96c970-82 (unregistering): left promiscuous mode
Jan 30 04:27:02 np0005601977 NetworkManager[55565]: <info>  [1769765222.5841] manager: (tap6a96c970-82): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00093|binding|INFO|Claiming lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 for this chassis.
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00094|binding|INFO|6a96c970-8213-4137-b6a7-4c31f1488ad5: Claiming fa:16:3e:54:7d:f1 10.100.0.9
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00095|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 ovn-installed in OVS
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.633 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.635 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:f1 10.100.0.9'], port_security=['fa:16:3e:54:7d:f1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf368f4-a96c-4392-8db3-50f404160fc3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2b8909af-505c-44a2-86bd-406e9cde5945', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2106cc88-5033-406a-bbeb-096c7422d7cf, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6a96c970-8213-4137-b6a7-4c31f1488ad5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00096|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 up in Southbound
Jan 30 04:27:02 np0005601977 podman[214382]: 2026-01-30 09:27:02.658709858 +0000 UTC m=+0.135236601 container remove 91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.663 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[80e69888-5341-43d2-827a-ac8cb1bcc904]: (4, ('Fri Jan 30 09:27:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 (91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e)\n91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e\nFri Jan 30 09:27:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 (91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e)\n91e8ac10ab7e978b2699bfcafd08426d430fb0a3a595427a76ab10eacffcfa6e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.665 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd48415-c8d2-4ec4-aadf-56dddcae56c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.665 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf368f4-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.667 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 kernel: tap9cf368f4-a0: left promiscuous mode
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00097|binding|INFO|Releasing lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 from this chassis (sb_readonly=0)
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00098|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 down in Southbound
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.673 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:02Z|00099|binding|INFO|Removing iface tap6a96c970-82 ovn-installed in OVS
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.675 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.676 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7d0d5a-98ae-4a0e-8286-cc8728432497]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 nova_compute[183130]: 2026-01-30 09:27:02.681 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.683 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:f1 10.100.0.9'], port_security=['fa:16:3e:54:7d:f1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf368f4-a96c-4392-8db3-50f404160fc3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2b8909af-505c-44a2-86bd-406e9cde5945', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2106cc88-5033-406a-bbeb-096c7422d7cf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6a96c970-8213-4137-b6a7-4c31f1488ad5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.698 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d15ba258-c1e0-4eee-b954-04ec3db604c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.700 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9317f435-69e5-4103-9b19-8442fee76811]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.711 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8e37d181-474e-4bb9-af92-bd55600c4a44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371382, 'reachable_time': 33555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214414, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.713 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.713 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[3d332ee7-b813-4645-8411-03291d35f6a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.714 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6a96c970-8213-4137-b6a7-4c31f1488ad5 in datapath 9cf368f4-a96c-4392-8db3-50f404160fc3 unbound from our chassis#033[00m
Jan 30 04:27:02 np0005601977 systemd[1]: run-netns-ovnmeta\x2d9cf368f4\x2da96c\x2d4392\x2d8db3\x2d50f404160fc3.mount: Deactivated successfully.
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.717 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf368f4-a96c-4392-8db3-50f404160fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.717 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a55b46-24a4-4b89-a6ea-c2daa6f7fe66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.718 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6a96c970-8213-4137-b6a7-4c31f1488ad5 in datapath 9cf368f4-a96c-4392-8db3-50f404160fc3 unbound from our chassis#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.719 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf368f4-a96c-4392-8db3-50f404160fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:27:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:02.720 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f571f6b2-0134-4ec7-9c99-b66a8470a81e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.189 183134 INFO nova.virt.libvirt.driver [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance shutdown successfully after 3 seconds.#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.196 183134 INFO nova.virt.libvirt.driver [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance destroyed successfully.#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.197 183134 DEBUG nova.virt.libvirt.vif [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:26:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-809581554',display_name='tempest-TestNetworkAdvancedServerOps-server-809581554',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-809581554',id=12,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFkWWUcp4/hru5LqJv27yDpf+1+iRsZOi/M8GbB/5I7iXHwxefzcmFLbcQt/GAvJQU8x8sEPj2RwtuV5cYtwmvilMTvuGdMtuc0URBoOB56Fvi6bvlKiWYrL++0Kht1i5g==',key_name='tempest-TestNetworkAdvancedServerOps-1685274608',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:26:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-hjlly9q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:26:56Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=37aaa571-2821-4d88-b360-9f7b02c1aa1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--456235833", "vif_mac": "fa:16:3e:54:7d:f1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.198 183134 DEBUG nova.network.os_vif_util [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Converting VIF {"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--456235833", "vif_mac": "fa:16:3e:54:7d:f1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.199 183134 DEBUG nova.network.os_vif_util [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.199 183134 DEBUG os_vif [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.200 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.201 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a96c970-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.202 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.204 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.208 183134 INFO os_vif [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82')#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.213 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.293 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.295 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.360 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.363 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Copying file /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk to 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.364 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.527 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "9c98ea59-db8f-40da-830b-351a58e44561-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.528 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.529 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "9c98ea59-db8f-40da-830b-351a58e44561-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.557 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.558 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.558 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.558 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.644 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.698 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.700 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.752 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.759 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.817 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.818 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.877 183134 DEBUG oslo_concurrency.processutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.883 183134 WARNING nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000c, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.947 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "scp -r /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk" returned: 0 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.948 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Copying file /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:27:03 np0005601977 nova_compute[183130]: 2026-01-30 09:27:03.948 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk.config 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.042 183134 WARNING nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.043 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5308MB free_disk=73.2735595703125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.044 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.044 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.165 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "scp -C -r /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk.config 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.167 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Copying file /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.167 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk.info 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.188 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Migration for instance 9c98ea59-db8f-40da-830b-351a58e44561 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.215 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.216 183134 INFO nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating resource usage from migration 88445283-129b-4f87-a75d-ab640dcf2a52#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.247 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.248 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Migration 773e579d-770a-48c5-a527-7dd90008a38b is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.248 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Instance 0b877545-56fc-40ba-b8dc-bae466bb2064 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.248 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Migration 88445283-129b-4f87-a75d-ab640dcf2a52 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.249 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.249 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.358 183134 DEBUG nova.compute.provider_tree [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.371 183134 DEBUG nova.scheduler.client.report [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.377 183134 DEBUG oslo_concurrency.processutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] CMD "scp -C -r /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_resize/disk.info 192.168.122.102:/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.398 183134 DEBUG nova.compute.resource_tracker [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.399 183134 DEBUG oslo_concurrency.lockutils [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.404 183134 INFO nova.compute.manager [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.464 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updating instance_info_cache with network_info: [{"id": "67ee4400-6557-46b1-b66a-75f59eee46ea", "address": "fa:16:3e:25:7d:54", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67ee4400-65", "ovs_interfaceid": "67ee4400-6557-46b1-b66a-75f59eee46ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.488 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-9c98ea59-db8f-40da-830b-351a58e44561" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.488 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.488 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.489 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.489 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.489 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.489 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.490 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.507 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.530 183134 INFO nova.scheduler.client.report [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Deleted allocation for migration 773e579d-770a-48c5-a527-7dd90008a38b#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.530 183134 DEBUG nova.virt.libvirt.driver [None req-2277cc08-7431-4cd2-a4f3-2d23b3090fd7 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.537 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.538 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.558 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.622 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.622 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.628 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.629 183134 INFO nova.compute.claims [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.631 183134 DEBUG neutronclient.v2_0.client [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 6a96c970-8213-4137-b6a7-4c31f1488ad5 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.746 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.751 183134 DEBUG oslo_concurrency.lockutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.751 183134 DEBUG oslo_concurrency.lockutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.751 183134 DEBUG oslo_concurrency.lockutils [None req-ab64473d-44e6-45c0-96b3-ada9ea9d8a99 9fd29364750549149196ccb9cea4a1a5 f22b8508a5bd4aaa92485e329d8a341a - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.793 183134 DEBUG nova.compute.provider_tree [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.807 183134 DEBUG nova.scheduler.client.report [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.829 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.830 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.878 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.878 183134 DEBUG nova.network.neutron [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.900 183134 INFO nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:27:04 np0005601977 nova_compute[183130]: 2026-01-30 09:27:04.919 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.007 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.009 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.010 183134 INFO nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Creating image(s)#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.011 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.011 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.012 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.031 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.086 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.087 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.087 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.097 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.140 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.141 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.174 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.175 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.176 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.230 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.231 183134 DEBUG nova.virt.disk.api [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.231 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.296 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.297 183134 DEBUG nova.virt.disk.api [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.297 183134 DEBUG nova.objects.instance [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid cff69cb0-f990-4189-91a5-729a7dcfe813 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.316 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.317 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Ensure instance console log exists: /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.318 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.318 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.318 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.398 183134 DEBUG nova.compute.manager [req-beb749a9-b72a-4142-98c9-2ebd0f41192f req-bad47fcf-aa6c-4c54-bb0c-60cef78289cc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.399 183134 DEBUG oslo_concurrency.lockutils [req-beb749a9-b72a-4142-98c9-2ebd0f41192f req-bad47fcf-aa6c-4c54-bb0c-60cef78289cc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.399 183134 DEBUG oslo_concurrency.lockutils [req-beb749a9-b72a-4142-98c9-2ebd0f41192f req-bad47fcf-aa6c-4c54-bb0c-60cef78289cc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.399 183134 DEBUG oslo_concurrency.lockutils [req-beb749a9-b72a-4142-98c9-2ebd0f41192f req-bad47fcf-aa6c-4c54-bb0c-60cef78289cc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.400 183134 DEBUG nova.compute.manager [req-beb749a9-b72a-4142-98c9-2ebd0f41192f req-bad47fcf-aa6c-4c54-bb0c-60cef78289cc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.400 183134 WARNING nova.compute.manager [req-beb749a9-b72a-4142-98c9-2ebd0f41192f req-bad47fcf-aa6c-4c54-bb0c-60cef78289cc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:27:05 np0005601977 nova_compute[183130]: 2026-01-30 09:27:05.404 183134 DEBUG nova.policy [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:27:06 np0005601977 nova_compute[183130]: 2026-01-30 09:27:06.362 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:06 np0005601977 nova_compute[183130]: 2026-01-30 09:27:06.362 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.367 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.391 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Triggering sync for uuid 0b877545-56fc-40ba-b8dc-bae466bb2064 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.392 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Triggering sync for uuid cff69cb0-f990-4189-91a5-729a7dcfe813 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.392 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Triggering sync for uuid 7a073e24-c800-4962-af5e-ff5400800f34 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.392 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.392 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.393 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.394 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "7a073e24-c800-4962-af5e-ff5400800f34" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.394 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "7a073e24-c800-4962-af5e-ff5400800f34" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.460 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "7a073e24-c800-4962-af5e-ff5400800f34" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.464 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.529 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.530 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.531 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.531 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.532 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.532 183134 WARNING nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.533 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.533 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.534 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.534 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.534 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.535 183134 WARNING nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.535 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.536 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.536 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.537 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.537 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.538 183134 WARNING nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.538 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.539 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.539 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.540 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.540 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.541 183134 WARNING nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.541 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.542 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.542 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.543 183134 DEBUG oslo_concurrency.lockutils [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.543 183134 DEBUG nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.543 183134 WARNING nova.compute.manager [req-c10cd0e1-4147-4c96-925c-f9dca76482a1 req-833b2f7e-d427-4490-9f68-07fbd47ce390 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:27:07 np0005601977 nova_compute[183130]: 2026-01-30 09:27:07.611 183134 DEBUG nova.network.neutron [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Successfully created port: 75563b45-d61c-4438-900a-9ae4c5d964cb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:27:07 np0005601977 podman[214457]: 2026-01-30 09:27:07.84954197 +0000 UTC m=+0.065174141 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:27:07 np0005601977 podman[214456]: 2026-01-30 09:27:07.854892983 +0000 UTC m=+0.069908016 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9/ubi-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible)
Jan 30 04:27:08 np0005601977 nova_compute[183130]: 2026-01-30 09:27:08.204 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:08 np0005601977 nova_compute[183130]: 2026-01-30 09:27:08.299 183134 DEBUG nova.compute.manager [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:08 np0005601977 nova_compute[183130]: 2026-01-30 09:27:08.300 183134 DEBUG nova.compute.manager [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing instance network info cache due to event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:27:08 np0005601977 nova_compute[183130]: 2026-01-30 09:27:08.300 183134 DEBUG oslo_concurrency.lockutils [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:27:08 np0005601977 nova_compute[183130]: 2026-01-30 09:27:08.301 183134 DEBUG oslo_concurrency.lockutils [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:27:08 np0005601977 nova_compute[183130]: 2026-01-30 09:27:08.301 183134 DEBUG nova.network.neutron [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.464 183134 DEBUG nova.network.neutron [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updated VIF entry in instance network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.465 183134 DEBUG nova.network.neutron [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.486 183134 DEBUG oslo_concurrency.lockutils [req-6733386a-87bc-4f7f-b8a8-1a2226507cbc req-d6524575-0be6-4d10-b556-5941759cd92f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.747 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.846 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765214.8451972, 9c98ea59-db8f-40da-830b-351a58e44561 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.847 183134 INFO nova.compute.manager [-] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.872 183134 DEBUG nova.compute.manager [None req-6ddad3d3-4ff3-436f-a561-f51ed13c66f9 - - - - - -] [instance: 9c98ea59-db8f-40da-830b-351a58e44561] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.982 183134 DEBUG nova.network.neutron [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Successfully updated port: 75563b45-d61c-4438-900a-9ae4c5d964cb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:27:09 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.999 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-cff69cb0-f990-4189-91a5-729a7dcfe813" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:27:10 np0005601977 nova_compute[183130]: 2026-01-30 09:27:09.999 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-cff69cb0-f990-4189-91a5-729a7dcfe813" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:27:10 np0005601977 nova_compute[183130]: 2026-01-30 09:27:10.000 183134 DEBUG nova.network.neutron [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:27:10 np0005601977 nova_compute[183130]: 2026-01-30 09:27:10.303 183134 DEBUG nova.network.neutron [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:27:10 np0005601977 nova_compute[183130]: 2026-01-30 09:27:10.393 183134 DEBUG nova.compute.manager [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received event network-changed-75563b45-d61c-4438-900a-9ae4c5d964cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:10 np0005601977 nova_compute[183130]: 2026-01-30 09:27:10.393 183134 DEBUG nova.compute.manager [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Refreshing instance network info cache due to event network-changed-75563b45-d61c-4438-900a-9ae4c5d964cb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:27:10 np0005601977 nova_compute[183130]: 2026-01-30 09:27:10.395 183134 DEBUG oslo_concurrency.lockutils [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-cff69cb0-f990-4189-91a5-729a7dcfe813" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.722 183134 DEBUG nova.network.neutron [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Updating instance_info_cache with network_info: [{"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.742 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-cff69cb0-f990-4189-91a5-729a7dcfe813" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.743 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Instance network_info: |[{"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.743 183134 DEBUG oslo_concurrency.lockutils [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-cff69cb0-f990-4189-91a5-729a7dcfe813" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.743 183134 DEBUG nova.network.neutron [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Refreshing network info cache for port 75563b45-d61c-4438-900a-9ae4c5d964cb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.745 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Start _get_guest_xml network_info=[{"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.752 183134 WARNING nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.757 183134 DEBUG nova.virt.libvirt.host [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.758 183134 DEBUG nova.virt.libvirt.host [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.772 183134 DEBUG nova.virt.libvirt.host [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.773 183134 DEBUG nova.virt.libvirt.host [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.774 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.775 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.776 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.776 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.777 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.777 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.777 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.778 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.778 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.779 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.779 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.779 183134 DEBUG nova.virt.hardware [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.785 183134 DEBUG nova.virt.libvirt.vif [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:27:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=15,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAVkv4mmVB5UaJ/8a+W0yTjdNMEDMDWZjiKyZ3lwpRNOPSKB1Q29ZN8O4ciWDNgK73rVrAmTwaP6vS7FsjkG0RG4rLQ39r9X3YRlBvA05Mvzev06/wQvZTphpUIMAHVVOA==',key_name='tempest-TestSecurityGroupsBasicOps-1567987350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-21xnkc98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:27:04Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=cff69cb0-f990-4189-91a5-729a7dcfe813,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.786 183134 DEBUG nova.network.os_vif_util [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.787 183134 DEBUG nova.network.os_vif_util [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.788 183134 DEBUG nova.objects.instance [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid cff69cb0-f990-4189-91a5-729a7dcfe813 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.803 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <uuid>cff69cb0-f990-4189-91a5-729a7dcfe813</uuid>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <name>instance-0000000f</name>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080</nova:name>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:27:11</nova:creationTime>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        <nova:port uuid="75563b45-d61c-4438-900a-9ae4c5d964cb">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <entry name="serial">cff69cb0-f990-4189-91a5-729a7dcfe813</entry>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <entry name="uuid">cff69cb0-f990-4189-91a5-729a7dcfe813</entry>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.config"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:ff:4e:aa"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <target dev="tap75563b45-d6"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/console.log" append="off"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:27:11 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:27:11 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:27:11 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:27:11 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.804 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Preparing to wait for external event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.805 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.805 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.805 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.806 183134 DEBUG nova.virt.libvirt.vif [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:27:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=15,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAVkv4mmVB5UaJ/8a+W0yTjdNMEDMDWZjiKyZ3lwpRNOPSKB1Q29ZN8O4ciWDNgK73rVrAmTwaP6vS7FsjkG0RG4rLQ39r9X3YRlBvA05Mvzev06/wQvZTphpUIMAHVVOA==',key_name='tempest-TestSecurityGroupsBasicOps-1567987350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-21xnkc98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:27:04Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=cff69cb0-f990-4189-91a5-729a7dcfe813,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.807 183134 DEBUG nova.network.os_vif_util [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.807 183134 DEBUG nova.network.os_vif_util [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.808 183134 DEBUG os_vif [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.809 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.809 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.810 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.812 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.813 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75563b45-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.813 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75563b45-d6, col_values=(('external_ids', {'iface-id': '75563b45-d61c-4438-900a-9ae4c5d964cb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:4e:aa', 'vm-uuid': 'cff69cb0-f990-4189-91a5-729a7dcfe813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.816 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:11 np0005601977 NetworkManager[55565]: <info>  [1769765231.8180] manager: (tap75563b45-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.823 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.824 183134 INFO os_vif [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6')#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.883 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.884 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.884 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:ff:4e:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:27:11 np0005601977 nova_compute[183130]: 2026-01-30 09:27:11.884 183134 INFO nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Using config drive#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.260 183134 INFO nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Creating config drive at /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.config#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.264 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkddwrl1v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.380 183134 DEBUG oslo_concurrency.processutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkddwrl1v" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:27:12 np0005601977 kernel: tap75563b45-d6: entered promiscuous mode
Jan 30 04:27:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:12Z|00100|binding|INFO|Claiming lport 75563b45-d61c-4438-900a-9ae4c5d964cb for this chassis.
Jan 30 04:27:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:12Z|00101|binding|INFO|75563b45-d61c-4438-900a-9ae4c5d964cb: Claiming fa:16:3e:ff:4e:aa 10.100.0.14
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.418 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:12 np0005601977 NetworkManager[55565]: <info>  [1769765232.4199] manager: (tap75563b45-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.424 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:12Z|00102|binding|INFO|Setting lport 75563b45-d61c-4438-900a-9ae4c5d964cb ovn-installed in OVS
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.427 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:4e:aa 10.100.0.14'], port_security=['fa:16:3e:ff:4e:aa 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'cff69cb0-f990-4189-91a5-729a7dcfe813', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '95f0b6cb-f834-48b3-a422-9f55b7068495', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5402d645-fdcd-44ae-9cdd-5c60cb019856, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=75563b45-d61c-4438-900a-9ae4c5d964cb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:12Z|00103|binding|INFO|Setting lport 75563b45-d61c-4438-900a-9ae4c5d964cb up in Southbound
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.428 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.429 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 75563b45-d61c-4438-900a-9ae4c5d964cb in datapath 4b6635c4-cf50-4be3-bead-3fd5f833ac92 bound to our chassis#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.432 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b6635c4-cf50-4be3-bead-3fd5f833ac92#033[00m
Jan 30 04:27:12 np0005601977 systemd-udevd[214514]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.448 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bb756560-5875-40c0-8316-bfd2dd4a46a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:12 np0005601977 NetworkManager[55565]: <info>  [1769765232.4538] device (tap75563b45-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:27:12 np0005601977 NetworkManager[55565]: <info>  [1769765232.4544] device (tap75563b45-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:27:12 np0005601977 systemd-machined[154431]: New machine qemu-8-instance-0000000f.
Jan 30 04:27:12 np0005601977 systemd[1]: Started Virtual Machine qemu-8-instance-0000000f.
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.470 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[62679f7d-3bbd-485c-bc0d-794ec0ed1b0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.473 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[140f0faf-c763-472f-ba9d-e713455d678e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.492 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[6945640e-6207-44c6-8734-266c551805f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.507 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2e825f23-7a4b-4ec2-84de-7da1badcc27f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b6635c4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:f6:6a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372413, 'reachable_time': 31539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214527, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.520 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1c07ee87-8fa4-46dd-a0be-22ee8610d0ab]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b6635c4-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372421, 'tstamp': 372421}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214529, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b6635c4-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372423, 'tstamp': 372423}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214529, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.522 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6635c4-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.559 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.562 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b6635c4-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.562 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.563 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b6635c4-c0, col_values=(('external_ids', {'iface-id': '50e65278-673c-4dc7-b450-b3f067941ab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:12.564 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.566 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.704 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765232.703685, cff69cb0-f990-4189-91a5-729a7dcfe813 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.704 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] VM Started (Lifecycle Event)#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.781 183134 DEBUG nova.compute.manager [req-87e50568-b95c-4562-a71e-3396ffa99c4b req-63634bb6-8449-43f3-bc20-d95b008135c7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.781 183134 DEBUG oslo_concurrency.lockutils [req-87e50568-b95c-4562-a71e-3396ffa99c4b req-63634bb6-8449-43f3-bc20-d95b008135c7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.782 183134 DEBUG oslo_concurrency.lockutils [req-87e50568-b95c-4562-a71e-3396ffa99c4b req-63634bb6-8449-43f3-bc20-d95b008135c7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.782 183134 DEBUG oslo_concurrency.lockutils [req-87e50568-b95c-4562-a71e-3396ffa99c4b req-63634bb6-8449-43f3-bc20-d95b008135c7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.782 183134 DEBUG nova.compute.manager [req-87e50568-b95c-4562-a71e-3396ffa99c4b req-63634bb6-8449-43f3-bc20-d95b008135c7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Processing event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.783 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.786 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.788 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.791 183134 INFO nova.virt.libvirt.driver [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Instance spawned successfully.#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.791 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:27:12 np0005601977 nova_compute[183130]: 2026-01-30 09:27:12.793 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.020 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.021 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.022 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.022 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.023 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.024 183134 DEBUG nova.virt.libvirt.driver [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.030 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.030 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765232.7037842, cff69cb0-f990-4189-91a5-729a7dcfe813 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.031 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.086 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.113 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.116 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765232.7855656, cff69cb0-f990-4189-91a5-729a7dcfe813 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.117 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.161 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.165 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.170 183134 INFO nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Took 8.16 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.171 183134 DEBUG nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.183 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.236 183134 INFO nova.compute.manager [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Took 8.63 seconds to build instance.#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.266 183134 DEBUG oslo_concurrency.lockutils [None req-37845464-321b-4def-a5e2-579496f76deb 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.267 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 5.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.267 183134 INFO nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:27:13 np0005601977 nova_compute[183130]: 2026-01-30 09:27:13.268 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:14 np0005601977 nova_compute[183130]: 2026-01-30 09:27:14.144 183134 DEBUG nova.network.neutron [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Updated VIF entry in instance network info cache for port 75563b45-d61c-4438-900a-9ae4c5d964cb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:27:14 np0005601977 nova_compute[183130]: 2026-01-30 09:27:14.144 183134 DEBUG nova.network.neutron [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Updating instance_info_cache with network_info: [{"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:14 np0005601977 nova_compute[183130]: 2026-01-30 09:27:14.163 183134 DEBUG oslo_concurrency.lockutils [req-080b8178-5a7b-4445-9e6b-033cbe5c9221 req-120dc4ae-aa4d-400d-95e1-1d33caa27817 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-cff69cb0-f990-4189-91a5-729a7dcfe813" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:27:14 np0005601977 nova_compute[183130]: 2026-01-30 09:27:14.794 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:15 np0005601977 nova_compute[183130]: 2026-01-30 09:27:15.234 183134 DEBUG nova.compute.manager [req-4330a0be-1cd7-4dc4-96be-671650eb7024 req-fc2bee98-26ec-4455-9bd7-80b30cfac31d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:15 np0005601977 nova_compute[183130]: 2026-01-30 09:27:15.235 183134 DEBUG oslo_concurrency.lockutils [req-4330a0be-1cd7-4dc4-96be-671650eb7024 req-fc2bee98-26ec-4455-9bd7-80b30cfac31d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:15 np0005601977 nova_compute[183130]: 2026-01-30 09:27:15.235 183134 DEBUG oslo_concurrency.lockutils [req-4330a0be-1cd7-4dc4-96be-671650eb7024 req-fc2bee98-26ec-4455-9bd7-80b30cfac31d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:15 np0005601977 nova_compute[183130]: 2026-01-30 09:27:15.235 183134 DEBUG oslo_concurrency.lockutils [req-4330a0be-1cd7-4dc4-96be-671650eb7024 req-fc2bee98-26ec-4455-9bd7-80b30cfac31d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:15 np0005601977 nova_compute[183130]: 2026-01-30 09:27:15.235 183134 DEBUG nova.compute.manager [req-4330a0be-1cd7-4dc4-96be-671650eb7024 req-fc2bee98-26ec-4455-9bd7-80b30cfac31d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] No waiting events found dispatching network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:15 np0005601977 nova_compute[183130]: 2026-01-30 09:27:15.236 183134 WARNING nova.compute.manager [req-4330a0be-1cd7-4dc4-96be-671650eb7024 req-fc2bee98-26ec-4455-9bd7-80b30cfac31d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received unexpected event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb for instance with vm_state active and task_state None.#033[00m
Jan 30 04:27:16 np0005601977 nova_compute[183130]: 2026-01-30 09:27:16.817 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:16 np0005601977 podman[214537]: 2026-01-30 09:27:16.833780129 +0000 UTC m=+0.049672851 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:27:16 np0005601977 podman[214538]: 2026-01-30 09:27:16.834682555 +0000 UTC m=+0.051242786 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:27:17 np0005601977 nova_compute[183130]: 2026-01-30 09:27:17.618 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:17 np0005601977 nova_compute[183130]: 2026-01-30 09:27:17.645 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765222.643545, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:27:17 np0005601977 nova_compute[183130]: 2026-01-30 09:27:17.645 183134 INFO nova.compute.manager [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:27:17 np0005601977 nova_compute[183130]: 2026-01-30 09:27:17.669 183134 DEBUG nova.compute.manager [None req-8a45d7e3-91de-4108-9770-f81e5f8b4236 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:27:17 np0005601977 nova_compute[183130]: 2026-01-30 09:27:17.672 183134 DEBUG nova.compute.manager [None req-8a45d7e3-91de-4108-9770-f81e5f8b4236 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_migrated, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:27:17 np0005601977 nova_compute[183130]: 2026-01-30 09:27:17.724 183134 INFO nova.compute.manager [None req-8a45d7e3-91de-4108-9770-f81e5f8b4236 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com#033[00m
Jan 30 04:27:19 np0005601977 nova_compute[183130]: 2026-01-30 09:27:19.797 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:21 np0005601977 nova_compute[183130]: 2026-01-30 09:27:21.861 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:21 np0005601977 podman[214576]: 2026-01-30 09:27:21.875507518 +0000 UTC m=+0.096761247 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 30 04:27:24 np0005601977 nova_compute[183130]: 2026-01-30 09:27:24.800 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:24Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:4e:aa 10.100.0.14
Jan 30 04:27:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:24Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:4e:aa 10.100.0.14
Jan 30 04:27:26 np0005601977 nova_compute[183130]: 2026-01-30 09:27:26.898 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:27.380 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:27 np0005601977 nova_compute[183130]: 2026-01-30 09:27:27.380 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:27.381 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:27:28 np0005601977 nova_compute[183130]: 2026-01-30 09:27:28.313 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:29 np0005601977 nova_compute[183130]: 2026-01-30 09:27:29.801 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:29 np0005601977 podman[214615]: 2026-01-30 09:27:29.829620816 +0000 UTC m=+0.052543653 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:27:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:31.383 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:31 np0005601977 nova_compute[183130]: 2026-01-30 09:27:31.900 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.530 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.531 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.531 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.531 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.532 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.533 183134 INFO nova.compute.manager [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Terminating instance#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.535 183134 DEBUG nova.compute.manager [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:27:32 np0005601977 kernel: tap75563b45-d6 (unregistering): left promiscuous mode
Jan 30 04:27:32 np0005601977 NetworkManager[55565]: <info>  [1769765252.5593] device (tap75563b45-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:27:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:32Z|00104|binding|INFO|Releasing lport 75563b45-d61c-4438-900a-9ae4c5d964cb from this chassis (sb_readonly=0)
Jan 30 04:27:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:32Z|00105|binding|INFO|Setting lport 75563b45-d61c-4438-900a-9ae4c5d964cb down in Southbound
Jan 30 04:27:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:32Z|00106|binding|INFO|Removing iface tap75563b45-d6 ovn-installed in OVS
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.602 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.603 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.607 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:4e:aa 10.100.0.14'], port_security=['fa:16:3e:ff:4e:aa 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'cff69cb0-f990-4189-91a5-729a7dcfe813', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95f0b6cb-f834-48b3-a422-9f55b7068495', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5402d645-fdcd-44ae-9cdd-5c60cb019856, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=75563b45-d61c-4438-900a-9ae4c5d964cb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.608 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.608 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 75563b45-d61c-4438-900a-9ae4c5d964cb in datapath 4b6635c4-cf50-4be3-bead-3fd5f833ac92 unbound from our chassis#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.610 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b6635c4-cf50-4be3-bead-3fd5f833ac92#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.621 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[10592805-5a03-4049-b1d1-98dfb22eb425]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.640 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b0bf1ff5-c44f-4e40-a74a-c7c14995533a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.642 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[05752201-8124-4d09-934c-1f8b9ed10c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:32 np0005601977 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 30 04:27:32 np0005601977 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Consumed 11.840s CPU time.
Jan 30 04:27:32 np0005601977 systemd-machined[154431]: Machine qemu-8-instance-0000000f terminated.
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.657 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[241d5384-aeae-4541-b8b0-278a8b74ff7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.669 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c43366c7-26fa-4293-81c5-a205b33b7ac1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b6635c4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:f6:6a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372413, 'reachable_time': 31539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214652, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.680 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f37484-b5ad-4931-879e-17dbd1a799eb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b6635c4-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372421, 'tstamp': 372421}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214653, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b6635c4-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372423, 'tstamp': 372423}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214653, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.682 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6635c4-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.683 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.686 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.687 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b6635c4-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.687 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.688 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b6635c4-c0, col_values=(('external_ids', {'iface-id': '50e65278-673c-4dc7-b450-b3f067941ab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:32.688 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.752 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.756 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.784 183134 INFO nova.virt.libvirt.driver [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Instance destroyed successfully.#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.784 183134 DEBUG nova.objects.instance [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid cff69cb0-f990-4189-91a5-729a7dcfe813 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.803 183134 DEBUG nova.virt.libvirt.vif [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:27:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-0-340698080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=15,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAVkv4mmVB5UaJ/8a+W0yTjdNMEDMDWZjiKyZ3lwpRNOPSKB1Q29ZN8O4ciWDNgK73rVrAmTwaP6vS7FsjkG0RG4rLQ39r9X3YRlBvA05Mvzev06/wQvZTphpUIMAHVVOA==',key_name='tempest-TestSecurityGroupsBasicOps-1567987350',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:27:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-21xnkc98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:27:13Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=cff69cb0-f990-4189-91a5-729a7dcfe813,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.803 183134 DEBUG nova.network.os_vif_util [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "75563b45-d61c-4438-900a-9ae4c5d964cb", "address": "fa:16:3e:ff:4e:aa", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75563b45-d6", "ovs_interfaceid": "75563b45-d61c-4438-900a-9ae4c5d964cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.804 183134 DEBUG nova.network.os_vif_util [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.805 183134 DEBUG os_vif [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.806 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.807 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75563b45-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.809 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.811 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.813 183134 INFO os_vif [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:4e:aa,bridge_name='br-int',has_traffic_filtering=True,id=75563b45-d61c-4438-900a-9ae4c5d964cb,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75563b45-d6')#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.814 183134 INFO nova.virt.libvirt.driver [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Deleting instance files /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813_del#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.815 183134 INFO nova.virt.libvirt.driver [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Deletion of /var/lib/nova/instances/cff69cb0-f990-4189-91a5-729a7dcfe813_del complete#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.859 183134 INFO nova.compute.manager [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.859 183134 DEBUG oslo.service.loopingcall [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.860 183134 DEBUG nova.compute.manager [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:27:32 np0005601977 nova_compute[183130]: 2026-01-30 09:27:32.860 183134 DEBUG nova.network.neutron [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:27:33 np0005601977 nova_compute[183130]: 2026-01-30 09:27:33.961 183134 DEBUG nova.network.neutron [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:33 np0005601977 nova_compute[183130]: 2026-01-30 09:27:33.988 183134 INFO nova.compute.manager [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Took 1.13 seconds to deallocate network for instance.#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.043 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.043 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.153 183134 DEBUG nova.compute.provider_tree [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.173 183134 DEBUG nova.scheduler.client.report [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.188 183134 DEBUG nova.compute.manager [req-ec4f838b-0ddf-43ca-b926-1ef495b83aee req-7cda0c43-2e57-4ad9-8a8c-9a808893e2a1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received event network-vif-unplugged-75563b45-d61c-4438-900a-9ae4c5d964cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.189 183134 DEBUG oslo_concurrency.lockutils [req-ec4f838b-0ddf-43ca-b926-1ef495b83aee req-7cda0c43-2e57-4ad9-8a8c-9a808893e2a1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.189 183134 DEBUG oslo_concurrency.lockutils [req-ec4f838b-0ddf-43ca-b926-1ef495b83aee req-7cda0c43-2e57-4ad9-8a8c-9a808893e2a1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.190 183134 DEBUG oslo_concurrency.lockutils [req-ec4f838b-0ddf-43ca-b926-1ef495b83aee req-7cda0c43-2e57-4ad9-8a8c-9a808893e2a1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.190 183134 DEBUG nova.compute.manager [req-ec4f838b-0ddf-43ca-b926-1ef495b83aee req-7cda0c43-2e57-4ad9-8a8c-9a808893e2a1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] No waiting events found dispatching network-vif-unplugged-75563b45-d61c-4438-900a-9ae4c5d964cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.190 183134 WARNING nova.compute.manager [req-ec4f838b-0ddf-43ca-b926-1ef495b83aee req-7cda0c43-2e57-4ad9-8a8c-9a808893e2a1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received unexpected event network-vif-unplugged-75563b45-d61c-4438-900a-9ae4c5d964cb for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.201 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.230 183134 INFO nova.scheduler.client.report [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance cff69cb0-f990-4189-91a5-729a7dcfe813#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.321 183134 DEBUG oslo_concurrency.lockutils [None req-99f7b2e7-959d-4e5d-b4b5-e266ebb0f2f9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.343 183134 DEBUG nova.compute.manager [req-50c374a0-5013-4856-b3e8-1f4b579d97bb req-4d9f4812-cd36-47f4-a252-d617fa745535 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received event network-vif-deleted-75563b45-d61c-4438-900a-9ae4c5d964cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:34 np0005601977 nova_compute[183130]: 2026-01-30 09:27:34.804 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:35 np0005601977 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 30 04:27:36 np0005601977 nova_compute[183130]: 2026-01-30 09:27:36.651 183134 DEBUG nova.compute.manager [req-864765c4-cf7a-49e9-bdab-7607a70a59e5 req-b5480b20-a809-404d-a6a4-f6341b4a4a05 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:36 np0005601977 nova_compute[183130]: 2026-01-30 09:27:36.652 183134 DEBUG oslo_concurrency.lockutils [req-864765c4-cf7a-49e9-bdab-7607a70a59e5 req-b5480b20-a809-404d-a6a4-f6341b4a4a05 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:36 np0005601977 nova_compute[183130]: 2026-01-30 09:27:36.652 183134 DEBUG oslo_concurrency.lockutils [req-864765c4-cf7a-49e9-bdab-7607a70a59e5 req-b5480b20-a809-404d-a6a4-f6341b4a4a05 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:36 np0005601977 nova_compute[183130]: 2026-01-30 09:27:36.652 183134 DEBUG oslo_concurrency.lockutils [req-864765c4-cf7a-49e9-bdab-7607a70a59e5 req-b5480b20-a809-404d-a6a4-f6341b4a4a05 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "cff69cb0-f990-4189-91a5-729a7dcfe813-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:36 np0005601977 nova_compute[183130]: 2026-01-30 09:27:36.653 183134 DEBUG nova.compute.manager [req-864765c4-cf7a-49e9-bdab-7607a70a59e5 req-b5480b20-a809-404d-a6a4-f6341b4a4a05 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] No waiting events found dispatching network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:36 np0005601977 nova_compute[183130]: 2026-01-30 09:27:36.653 183134 WARNING nova.compute.manager [req-864765c4-cf7a-49e9-bdab-7607a70a59e5 req-b5480b20-a809-404d-a6a4-f6341b4a4a05 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Received unexpected event network-vif-plugged-75563b45-d61c-4438-900a-9ae4c5d964cb for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.420 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.421 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.421 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.421 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.421 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.422 183134 INFO nova.compute.manager [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Terminating instance#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.423 183134 DEBUG nova.compute.manager [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:27:37 np0005601977 kernel: tap94783e16-93 (unregistering): left promiscuous mode
Jan 30 04:27:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:37Z|00107|binding|INFO|Releasing lport 94783e16-931b-49a5-9848-2b5c6206ac8a from this chassis (sb_readonly=0)
Jan 30 04:27:37 np0005601977 NetworkManager[55565]: <info>  [1769765257.4434] device (tap94783e16-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:27:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:37Z|00108|binding|INFO|Setting lport 94783e16-931b-49a5-9848-2b5c6206ac8a down in Southbound
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.443 183134 DEBUG nova.compute.manager [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-changed-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.444 183134 DEBUG nova.compute.manager [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Refreshing instance network info cache due to event network-changed-94783e16-931b-49a5-9848-2b5c6206ac8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.444 183134 DEBUG oslo_concurrency.lockutils [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.444 183134 DEBUG oslo_concurrency.lockutils [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.444 183134 DEBUG nova.network.neutron [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Refreshing network info cache for port 94783e16-931b-49a5-9848-2b5c6206ac8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:27:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:37Z|00109|binding|INFO|Removing iface tap94783e16-93 ovn-installed in OVS
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.445 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.446 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.449 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.474 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:bf:ec 10.100.0.12'], port_security=['fa:16:3e:87:bf:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0b877545-56fc-40ba-b8dc-bae466bb2064', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95f0b6cb-f834-48b3-a422-9f55b7068495 eecfbcd8-2f91-46a3-95ca-ae6e61909029', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5402d645-fdcd-44ae-9cdd-5c60cb019856, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=94783e16-931b-49a5-9848-2b5c6206ac8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.476 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 94783e16-931b-49a5-9848-2b5c6206ac8a in datapath 4b6635c4-cf50-4be3-bead-3fd5f833ac92 unbound from our chassis#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.481 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b6635c4-cf50-4be3-bead-3fd5f833ac92, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.482 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0973605a-583a-4366-806a-e56915013ee3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.482 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92 namespace which is not needed anymore#033[00m
Jan 30 04:27:37 np0005601977 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 30 04:27:37 np0005601977 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Consumed 13.920s CPU time.
Jan 30 04:27:37 np0005601977 systemd-machined[154431]: Machine qemu-7-instance-0000000d terminated.
Jan 30 04:27:37 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [NOTICE]   (214064) : haproxy version is 2.8.14-c23fe91
Jan 30 04:27:37 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [NOTICE]   (214064) : path to executable is /usr/sbin/haproxy
Jan 30 04:27:37 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [WARNING]  (214064) : Exiting Master process...
Jan 30 04:27:37 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [WARNING]  (214064) : Exiting Master process...
Jan 30 04:27:37 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [ALERT]    (214064) : Current worker (214066) exited with code 143 (Terminated)
Jan 30 04:27:37 np0005601977 neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92[214060]: [WARNING]  (214064) : All workers exited. Exiting... (0)
Jan 30 04:27:37 np0005601977 systemd[1]: libpod-5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48.scope: Deactivated successfully.
Jan 30 04:27:37 np0005601977 podman[214696]: 2026-01-30 09:27:37.591240667 +0000 UTC m=+0.037045157 container died 5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:27:37 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48-userdata-shm.mount: Deactivated successfully.
Jan 30 04:27:37 np0005601977 systemd[1]: var-lib-containers-storage-overlay-5144771f2afdfe1a79ccfba01f37f0bdc8fd174f88b73b51054ced42acf5cd3d-merged.mount: Deactivated successfully.
Jan 30 04:27:37 np0005601977 podman[214696]: 2026-01-30 09:27:37.62471797 +0000 UTC m=+0.070522460 container cleanup 5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.636 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.639 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 systemd[1]: libpod-conmon-5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48.scope: Deactivated successfully.
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.664 183134 INFO nova.virt.libvirt.driver [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Instance destroyed successfully.#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.664 183134 DEBUG nova.objects.instance [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid 0b877545-56fc-40ba-b8dc-bae466bb2064 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:27:37 np0005601977 podman[214727]: 2026-01-30 09:27:37.673426662 +0000 UTC m=+0.034290148 container remove 5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.677 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[41918a53-83df-4293-b48e-8b16d8e6f982]: (4, ('Fri Jan 30 09:27:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92 (5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48)\n5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48\nFri Jan 30 09:27:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92 (5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48)\n5d932488b780c9904196fc30d1054028bfb7a46633b0e7d55caf69cd737ccf48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.678 183134 DEBUG nova.virt.libvirt.vif [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-739901063',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=13,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAVkv4mmVB5UaJ/8a+W0yTjdNMEDMDWZjiKyZ3lwpRNOPSKB1Q29ZN8O4ciWDNgK73rVrAmTwaP6vS7FsjkG0RG4rLQ39r9X3YRlBvA05Mvzev06/wQvZTphpUIMAHVVOA==',key_name='tempest-TestSecurityGroupsBasicOps-1567987350',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:26:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-dze3j61d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:26:42Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=0b877545-56fc-40ba-b8dc-bae466bb2064,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.678 183134 DEBUG nova.network.os_vif_util [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.678 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[26efb9df-15b6-4ba9-b43d-32d0ce435283]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.678 183134 DEBUG nova.network.os_vif_util [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.679 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6635c4-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.679 183134 DEBUG os_vif [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.680 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.680 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94783e16-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:37 np0005601977 kernel: tap4b6635c4-c0: left promiscuous mode
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.682 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.685 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.688 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.689 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.690 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7effebb9-f6d2-44e6-a445-9e5f83e9b6f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.691 183134 INFO os_vif [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:bf:ec,bridge_name='br-int',has_traffic_filtering=True,id=94783e16-931b-49a5-9848-2b5c6206ac8a,network=Network(4b6635c4-cf50-4be3-bead-3fd5f833ac92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94783e16-93')#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.691 183134 INFO nova.virt.libvirt.driver [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Deleting instance files /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064_del#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.691 183134 INFO nova.virt.libvirt.driver [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Deletion of /var/lib/nova/instances/0b877545-56fc-40ba-b8dc-bae466bb2064_del complete#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.703 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c82c0a40-75c5-4ece-883a-b3510ee01c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.704 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea047b6-c1fa-4a5e-b16a-7c9f18ba4b7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.719 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc145a6-1cf0-44e0-af18-5149f41983d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372407, 'reachable_time': 22224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214757, 'error': None, 'target': 'ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.720 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b6635c4-cf50-4be3-bead-3fd5f833ac92 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:27:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:37.720 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[6c095a56-a23a-45c3-aaf0-14f0471cc983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:27:37 np0005601977 systemd[1]: run-netns-ovnmeta\x2d4b6635c4\x2dcf50\x2d4be3\x2dbead\x2d3fd5f833ac92.mount: Deactivated successfully.
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.745 183134 INFO nova.compute.manager [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.746 183134 DEBUG oslo.service.loopingcall [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.748 183134 DEBUG nova.compute.manager [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:27:37 np0005601977 nova_compute[183130]: 2026-01-30 09:27:37.748 183134 DEBUG nova.network.neutron [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:27:38 np0005601977 podman[214758]: 2026-01-30 09:27:38.830249217 +0000 UTC m=+0.052026759 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Jan 30 04:27:38 np0005601977 podman[214759]: 2026-01-30 09:27:38.861032063 +0000 UTC m=+0.080507989 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.353 183134 DEBUG nova.network.neutron [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updated VIF entry in instance network info cache for port 94783e16-931b-49a5-9848-2b5c6206ac8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.354 183134 DEBUG nova.network.neutron [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updating instance_info_cache with network_info: [{"id": "94783e16-931b-49a5-9848-2b5c6206ac8a", "address": "fa:16:3e:87:bf:ec", "network": {"id": "4b6635c4-cf50-4be3-bead-3fd5f833ac92", "bridge": "br-int", "label": "tempest-network-smoke--209588457", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94783e16-93", "ovs_interfaceid": "94783e16-931b-49a5-9848-2b5c6206ac8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.372 183134 DEBUG nova.network.neutron [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.375 183134 DEBUG oslo_concurrency.lockutils [req-2e35fed0-0e1c-49e9-8c48-1b3b1a41abed req-9600cc28-3cbe-46dc-ae91-6945c59ec7e2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-0b877545-56fc-40ba-b8dc-bae466bb2064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.392 183134 INFO nova.compute.manager [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Took 1.64 seconds to deallocate network for instance.#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.436 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.437 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.541 183134 DEBUG nova.compute.provider_tree [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.559 183134 DEBUG nova.scheduler.client.report [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.584 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.611 183134 INFO nova.scheduler.client.report [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance 0b877545-56fc-40ba-b8dc-bae466bb2064#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.676 183134 DEBUG oslo_concurrency.lockutils [None req-b4af985a-d0e0-474b-93a3-56274df565cc 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.743 183134 DEBUG nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-vif-unplugged-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.744 183134 DEBUG oslo_concurrency.lockutils [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.745 183134 DEBUG oslo_concurrency.lockutils [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.745 183134 DEBUG oslo_concurrency.lockutils [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.746 183134 DEBUG nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] No waiting events found dispatching network-vif-unplugged-94783e16-931b-49a5-9848-2b5c6206ac8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.746 183134 WARNING nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received unexpected event network-vif-unplugged-94783e16-931b-49a5-9848-2b5c6206ac8a for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.747 183134 DEBUG nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.747 183134 DEBUG oslo_concurrency.lockutils [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.748 183134 DEBUG oslo_concurrency.lockutils [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.748 183134 DEBUG oslo_concurrency.lockutils [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0b877545-56fc-40ba-b8dc-bae466bb2064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.749 183134 DEBUG nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] No waiting events found dispatching network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.749 183134 WARNING nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received unexpected event network-vif-plugged-94783e16-931b-49a5-9848-2b5c6206ac8a for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.749 183134 DEBUG nova.compute.manager [req-96c078a9-bf35-4a66-9ec2-e8d740fc469c req-c4bc5486-0b4a-4dc5-8e6b-9b7f755e1a64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Received event network-vif-deleted-94783e16-931b-49a5-9848-2b5c6206ac8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:27:39 np0005601977 nova_compute[183130]: 2026-01-30 09:27:39.806 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:42 np0005601977 nova_compute[183130]: 2026-01-30 09:27:42.723 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:42Z|00110|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:27:42 np0005601977 nova_compute[183130]: 2026-01-30 09:27:42.838 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:27:42Z|00111|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:27:42 np0005601977 nova_compute[183130]: 2026-01-30 09:27:42.944 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:27:44 np0005601977 nova_compute[183130]: 2026-01-30 09:27:44.843 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:27:47 np0005601977 nova_compute[183130]: 2026-01-30 09:27:47.726 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:27:47 np0005601977 nova_compute[183130]: 2026-01-30 09:27:47.783 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765252.782182, cff69cb0-f990-4189-91a5-729a7dcfe813 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:27:47 np0005601977 nova_compute[183130]: 2026-01-30 09:27:47.784 183134 INFO nova.compute.manager [-] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] VM Stopped (Lifecycle Event)
Jan 30 04:27:47 np0005601977 nova_compute[183130]: 2026-01-30 09:27:47.804 183134 DEBUG nova.compute.manager [None req-e1c5eb0c-46fd-4d03-b2b7-909007ce0e36 - - - - - -] [instance: cff69cb0-f990-4189-91a5-729a7dcfe813] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:27:47 np0005601977 podman[214799]: 2026-01-30 09:27:47.842942529 +0000 UTC m=+0.055027965 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:27:47 np0005601977 podman[214798]: 2026-01-30 09:27:47.849013594 +0000 UTC m=+0.064807457 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:27:49 np0005601977 nova_compute[183130]: 2026-01-30 09:27:49.845 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:27:52 np0005601977 nova_compute[183130]: 2026-01-30 09:27:52.663 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765257.662369, 0b877545-56fc-40ba-b8dc-bae466bb2064 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:27:52 np0005601977 nova_compute[183130]: 2026-01-30 09:27:52.663 183134 INFO nova.compute.manager [-] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] VM Stopped (Lifecycle Event)
Jan 30 04:27:52 np0005601977 nova_compute[183130]: 2026-01-30 09:27:52.681 183134 DEBUG nova.compute.manager [None req-c24f183d-5f9d-44f7-9057-d79cba021094 - - - - - -] [instance: 0b877545-56fc-40ba-b8dc-bae466bb2064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:27:52 np0005601977 nova_compute[183130]: 2026-01-30 09:27:52.727 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:27:52 np0005601977 podman[214837]: 2026-01-30 09:27:52.880862267 +0000 UTC m=+0.090973100 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:27:54 np0005601977 nova_compute[183130]: 2026-01-30 09:27:54.846 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.450 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7a073e24-c800-4962-af5e-ff5400800f34', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'hostId': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'name': 'tempest-TestNetworkAdvancedServerOps-server-809581554', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'hostId': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.457 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.459 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6172cb2-d7b3-47a7-92cd-ec96053748cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.453662', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f54cba32-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': '16a475df8b0d99e3ab4a551026350bd18ed70205c5a18b6d0404ca5e8366555d'}]}, 'timestamp': '2026-01-30 09:27:55.459734', '_unique_id': '4ba7c9299add42f08ebb75cb5382a465'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.461 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.480 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/cpu volume: 670000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.481 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '405f0e55-a0b6-4305-b7a6-28b7df6f86ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 670000000, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'timestamp': '2026-01-30T09:27:55.462999', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f55032e8-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.874813584, 'message_signature': '70d011752b0e7fea2b51475a5a7290744d49db830be954b360c582350fade51d'}]}, 'timestamp': '2026-01-30 09:27:55.481806', '_unique_id': 'e58050649f9c435cbe9e9a5a07b30ad4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.500 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.501 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.502 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dd166a8-5b51-4983-a831-a88272d43a1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.486503', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5534cc6-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.881446925, 'message_signature': '69bf70c588518c162a32ffeb55de99690cabfd32e35991fedf00f51c08b7068f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.486503', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f553686e-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.881446925, 'message_signature': '57300906456dfb984b492ce0584a51c47c171ed07df4e94c8a21a8b4c3993929'}]}, 'timestamp': '2026-01-30 09:27:55.503141', '_unique_id': '9d9697dd3d9c4825b19daeff12501783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.507 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.508 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '864a76ab-26b5-48e6-9e22-0362b8a27c23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.506991', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f5544a18-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': 'ee9b0b5d4d3ec8b0aad985c11cf31ee9c585e1ebae3080834de23a4bf2c3f1e0'}]}, 'timestamp': '2026-01-30 09:27:55.508860', '_unique_id': 'a9cb4a61edc84b1285ceecfd9899a2e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.512 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.512 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>]
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.513 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/memory.usage volume: 42.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.514 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfff95ff-d2fa-4850-bc22-1c061e77926f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4140625, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'timestamp': '2026-01-30T09:27:55.513142', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f5553ac2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.874813584, 'message_signature': '99f56bc974d884a241177faa0fd8b7f769c6fd4ef9cbe4714c070c6eb18a4451'}]}, 'timestamp': '2026-01-30 09:27:55.515059', '_unique_id': '5cf92d7dcb054aa68886a7d55e44a299'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.518 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.518 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.allocation volume: 30617600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.518 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.520 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87e7054a-fbd3-4be8-9252-ea508de5f686', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30617600, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.518269', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5560182-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.881446925, 'message_signature': 'd37e0731947ab6cf24f33783e35bd6c3aba9fd0107c617442e4be162a859c6bb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.518269', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f556173a-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.881446925, 'message_signature': '1270db2501759c7fb9abe61ab942de88c0f920afaf32e7b6799ebcbb323df73e'}]}, 'timestamp': '2026-01-30 09:27:55.520408', '_unique_id': 'f6b7ee4ae083424d9cbec3c0ba6b24e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.522 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.523 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>]
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.523 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.524 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23e58c0b-e416-4ff2-a8aa-2fac22794ec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.523549', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f556cbd0-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': '22d015405c0ccc7ffe5b8993276a368ea7162dac59ed0480c17d44049075c6d5'}]}, 'timestamp': '2026-01-30 09:27:55.524953', '_unique_id': '4907abb765004bf68a5ae55a7a47aaea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.554 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.555 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.556 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb18c7e1-c514-4694-8d44-81208caafcb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.527065', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55b9782-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': '62524a884b2c251ee3dd681465c4ac0995f0df8130fb989fb21c42bfb1acf79e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.527065', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55ba22c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': '30a4f3fce4cbef04d9e48a49b80b115d645c165fda4ae70965b25854f9d1b725'}]}, 'timestamp': '2026-01-30 09:27:55.556441', '_unique_id': '8a769f852e86488898517c838429bb57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.557 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.558 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f33a8ee5-e5c0-4e74-a419-c3dc6478f79f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.557947', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55c06f4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': 'cee4e8948179472f3ebd27105598244e7664fb1fc265c1f3622632a59643474e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.557947', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55c0fd2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': '4171cca2a57db854cbf2d8b2cb1a382cef36a2a8b0c264574020a8437ef95cc6'}]}, 'timestamp': '2026-01-30 09:27:55.559424', '_unique_id': '61c8392ed81c423489373568b02c5c71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.560 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.560 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>]
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.560 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.560 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.560 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-809581554>]
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.561 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.561 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.bytes volume: 790 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.561 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af5acf11-9054-4f07-bfda-0ad030d5c16e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 790, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.561094', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55c8250-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': 'a3374a29d935426aa6e2cbf8807c29f8f7cf8e2cb0355c75648e8f9eb9df874d'}]}, 'timestamp': '2026-01-30 09:27:55.561920', '_unique_id': '01af3833691648238420c904111fd73d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.563 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.563 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.563 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd753b652-2ace-44e8-8f16-1cdad06fe6aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.563044', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55ccecc-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.881446925, 'message_signature': '31d5fdb845b0593db51290c91f3fe1f561d57f5544eb189f479167d902e37ac1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.563044', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55cd6c4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.881446925, 'message_signature': '990d4ee61e81c9d8157f7eaed389520f7e11f0d876aa15e9fd104a63dc08d8a8'}]}, 'timestamp': '2026-01-30 09:27:55.564093', '_unique_id': '8d401ab322844353b0d9f9f0955c5ab2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.565 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.565 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82ca75ca-9b27-41e0-ad31-3e1300d20b4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.565112', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55d1fda-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': '90a0f73342fe33c574130da1e35c305902edeb803482cd3a953fe0dc8494d292'}]}, 'timestamp': '2026-01-30 09:27:55.565954', '_unique_id': '0c3db03bacb447fea6dc5779a2f637cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.567 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.567 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '803e4fee-5733-441f-b8e6-1b369f098809', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.566981', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55d6882-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': 'b96fbfb4ade4e1ee5c6e73e60adcf6c1f978b49cf06215a682b4aa5f292f1226'}]}, 'timestamp': '2026-01-30 09:27:55.567898', '_unique_id': '5b41515da2a84c1eb2e1d2f98feef703'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.568 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.569 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '403e8d51-bfbb-4ac3-ad08-198db9082cd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.568890', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55db1f2-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': 'fe731fd5172427b513f3365395209c54e139f8ce2f7044c92e60b59e8396782e'}]}, 'timestamp': '2026-01-30 09:27:55.569699', '_unique_id': '926a4786d7654aa2acf7bdd89a1387dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.570 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.571 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62c21fd4-6102-4644-a3b8-0b623502fc9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.570689', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55df824-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': '78a540bba10975f520baf7c92f2c8c00ac786e38c2577a01edcb6285ace4715c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.570689', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55e0134-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': 'c44291eebca201abb6159e4cbbcb0819773eaca0b57ed2a72a28210a67d60d64'}]}, 'timestamp': '2026-01-30 09:27:55.571707', '_unique_id': 'e2a0d296b75e4235bb59df3438876c34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.572 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.573 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feaaebe6-4292-4e1c-bcba-910889f72df7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.572732', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55e4806-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': '311146e1af3b810ef68994968844d0d46989a9c953845f2d1f8c1ce0acccf83c'}]}, 'timestamp': '2026-01-30 09:27:55.573608', '_unique_id': 'ab90858888ae446c8358ab32a89c88b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.574 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.bytes.delta volume: 84 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8ad15a0-e6a7-4643-a1a8-a7bdabfe0c24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 84, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.574635', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55e925c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': 'a1d5161bdbccdb52ba801f86f89846f6bbe681d1441eeca4879802e367dc9cc4'}]}, 'timestamp': '2026-01-30 09:27:55.575511', '_unique_id': '1d45562d15a34de5b19ebb402dffb5ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.576 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.576 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e83881e9-803c-4072-a937-da274f4def6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:27:55.576497', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': 'f55edb2c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.848566068, 'message_signature': '1c15e3f7a4718fa5a790483ec70506ab64882d541cfddf5d9fe14e9bb6cebbdb'}]}, 'timestamp': '2026-01-30 09:27:55.577350', '_unique_id': 'a43cf0607f4e4915b02d568451dc1a1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.578 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.578 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9df9dc28-99a5-48d3-a271-aa0f0dc9a0e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.578346', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55f235c-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': 'edc9ced861a2d451465897e7b0483bf433630cf59860a5462daae2fe14147d9b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.578346', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55f2bf4-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': '0454d6da78a74b5c951879b48e6abcb265163eda021877dda0ce5109649ef827'}]}, 'timestamp': '2026-01-30 09:27:55.579370', '_unique_id': '61b7d8593a4946aabbcbde4bfdbf075b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.580 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.580 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.580 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dd15d83-d0e7-4de7-9dfe-49fb85c3c282', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.580381', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55f72c6-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': '7a3ae1e6faaac63ca6094b70eca91b8d3a48b09966abeb6da40921e112267041'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.580381', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55f7bd6-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': 'a5930ed386333e71f22b0d61fa9f016e377904d1de07d2fde5a4a2c7ce64e0c5'}]}, 'timestamp': '2026-01-30 09:27:55.581425', '_unique_id': '21cf76e4052444a8ad2ef9832b7c608e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.582 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.582 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.582 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.583 12 DEBUG ceilometer.compute.pollsters [-] Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000000c, id=37aaa571-2821-4d88-b360-9f7b02c1aa1b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '442df56f-1190-4832-a215-8e21a2e2be00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:27:55.582583', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f55fc9ba-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': 'd3764797758544fd306da89f4b9309fcb8a9ee581fc69014a652167822b37baa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 
'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:27:55.582583', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f55fd234-fdbd-11f0-a471-fa163eabe782', 'monotonic_time': 3806.922090584, 'message_signature': 'd49683ae0dd7d82e6ebec2ab9bd75a85ac6c591cfabaf096fdcf640f398ee08c'}]}, 'timestamp': '2026-01-30 09:27:55.583594', '_unique_id': '639aa4ba9960498ba02dbad6ada8101d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:27:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:27:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.366 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.431 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.476 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.477 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.517 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.522 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000c, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.632 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.634 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5466MB free_disk=73.30220794677734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.634 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.634 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.679 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Migration for instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.701 183134 INFO nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating resource usage from migration 88445283-129b-4f87-a75d-ab640dcf2a52
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.701 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Starting to track outgoing migration 88445283-129b-4f87-a75d-ab640dcf2a52 with flavor 43faf4bc-65eb-437f-b3dc-707ebe898840 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.800 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.800 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Migration 88445283-129b-4f87-a75d-ab640dcf2a52 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.801 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.801 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.860 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing inventories for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.934 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating ProviderTree inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.934 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.976 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing aggregate associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 30 04:27:56 np0005601977 nova_compute[183130]: 2026-01-30 09:27:56.997 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing trait associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, traits: HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 30 04:27:57 np0005601977 nova_compute[183130]: 2026-01-30 09:27:57.048 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:27:57 np0005601977 nova_compute[183130]: 2026-01-30 09:27:57.065 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:27:57 np0005601977 nova_compute[183130]: 2026-01-30 09:27:57.091 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:27:57 np0005601977 nova_compute[183130]: 2026-01-30 09:27:57.092 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:27:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:57.380 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:27:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:57.381 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:27:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:27:57.381 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:57 np0005601977 nova_compute[183130]: 2026-01-30 09:27:57.729 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:27:59 np0005601977 nova_compute[183130]: 2026-01-30 09:27:59.894 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:00 np0005601977 podman[214870]: 2026-01-30 09:28:00.857984664 +0000 UTC m=+0.067015110 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:28:02 np0005601977 nova_compute[183130]: 2026-01-30 09:28:02.093 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:02 np0005601977 nova_compute[183130]: 2026-01-30 09:28:02.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:02 np0005601977 nova_compute[183130]: 2026-01-30 09:28:02.733 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:03 np0005601977 nova_compute[183130]: 2026-01-30 09:28:03.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:03 np0005601977 nova_compute[183130]: 2026-01-30 09:28:03.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:28:03 np0005601977 nova_compute[183130]: 2026-01-30 09:28:03.630 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:03 np0005601977 nova_compute[183130]: 2026-01-30 09:28:03.631 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:03 np0005601977 nova_compute[183130]: 2026-01-30 09:28:03.631 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:28:04 np0005601977 nova_compute[183130]: 2026-01-30 09:28:04.861 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:04 np0005601977 nova_compute[183130]: 2026-01-30 09:28:04.861 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:04 np0005601977 nova_compute[183130]: 2026-01-30 09:28:04.895 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:04 np0005601977 nova_compute[183130]: 2026-01-30 09:28:04.899 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.097 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.097 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.105 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.106 183134 INFO nova.compute.claims [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.312 183134 DEBUG nova.compute.provider_tree [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.347 183134 DEBUG nova.scheduler.client.report [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.383 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.383 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.456 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.457 183134 DEBUG nova.network.neutron [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.478 183134 INFO nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.501 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.878 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.880 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.880 183134 INFO nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Creating image(s)#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.881 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.881 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.882 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.903 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.959 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.959 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.960 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:05 np0005601977 nova_compute[183130]: 2026-01-30 09:28:05.971 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.023 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.023 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.048 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.049 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.050 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.079 183134 DEBUG nova.policy [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.092 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.092 183134 DEBUG nova.virt.disk.api [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.093 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.162 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.164 183134 DEBUG nova.virt.disk.api [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.164 183134 DEBUG nova.objects.instance [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid a9a1475a-af89-477b-bc8b-31a79fa63f3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.195 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.196 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Ensure instance console log exists: /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.197 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.197 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.198 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.558 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updating instance_info_cache with network_info: [{"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.583 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.584 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.584 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.585 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.585 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:06 np0005601977 nova_compute[183130]: 2026-01-30 09:28:06.586 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:28:07 np0005601977 nova_compute[183130]: 2026-01-30 09:28:07.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:07 np0005601977 nova_compute[183130]: 2026-01-30 09:28:07.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:07 np0005601977 nova_compute[183130]: 2026-01-30 09:28:07.735 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:08 np0005601977 nova_compute[183130]: 2026-01-30 09:28:08.443 183134 DEBUG nova.network.neutron [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Successfully created port: 620ad6fc-218e-4453-8811-06bd2d12bb67 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:28:09 np0005601977 podman[214910]: 2026-01-30 09:28:09.869561783 +0000 UTC m=+0.085195512 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:28:09 np0005601977 podman[214911]: 2026-01-30 09:28:09.872184649 +0000 UTC m=+0.084385730 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 30 04:28:09 np0005601977 nova_compute[183130]: 2026-01-30 09:28:09.942 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:11 np0005601977 nova_compute[183130]: 2026-01-30 09:28:11.513 183134 DEBUG nova.network.neutron [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Successfully updated port: 620ad6fc-218e-4453-8811-06bd2d12bb67 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:28:11 np0005601977 nova_compute[183130]: 2026-01-30 09:28:11.530 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:11 np0005601977 nova_compute[183130]: 2026-01-30 09:28:11.531 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:11 np0005601977 nova_compute[183130]: 2026-01-30 09:28:11.532 183134 DEBUG nova.network.neutron [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:28:12 np0005601977 nova_compute[183130]: 2026-01-30 09:28:12.391 183134 DEBUG nova.network.neutron [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:28:12 np0005601977 nova_compute[183130]: 2026-01-30 09:28:12.737 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:14 np0005601977 nova_compute[183130]: 2026-01-30 09:28:14.945 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.570 183134 DEBUG nova.compute.manager [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-changed-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.571 183134 DEBUG nova.compute.manager [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Refreshing instance network info cache due to event network-changed-620ad6fc-218e-4453-8811-06bd2d12bb67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.571 183134 DEBUG oslo_concurrency.lockutils [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.849 183134 DEBUG nova.network.neutron [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updating instance_info_cache with network_info: [{"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.904 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.905 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Instance network_info: |[{"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.905 183134 DEBUG oslo_concurrency.lockutils [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.906 183134 DEBUG nova.network.neutron [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Refreshing network info cache for port 620ad6fc-218e-4453-8811-06bd2d12bb67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.909 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Start _get_guest_xml network_info=[{"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.918 183134 WARNING nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.929 183134 DEBUG nova.virt.libvirt.host [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.930 183134 DEBUG nova.virt.libvirt.host [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.944 183134 DEBUG nova.virt.libvirt.host [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.945 183134 DEBUG nova.virt.libvirt.host [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.947 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.947 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.948 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.948 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.948 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.948 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.948 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.949 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.949 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.949 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.949 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.950 183134 DEBUG nova.virt.hardware [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.954 183134 DEBUG nova.virt.libvirt.vif [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=18,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNCq5HZPgINJROVJRLOqP2eaN821VUl6YKdWNtUZ9NbgWd7o4YI/tIyy4Hs0UiWxnKw5E/5LJHRl+c6TQFgJ4DtjJZbbvvVQ99OtYsa/oy7cAgv21uCmQWp+31SzPJJP0A==',key_name='tempest-TestSecurityGroupsBasicOps-2143582879',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-xx3cbvri',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:28:05Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=a9a1475a-af89-477b-bc8b-31a79fa63f3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.955 183134 DEBUG nova.network.os_vif_util [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.955 183134 DEBUG nova.network.os_vif_util [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.956 183134 DEBUG nova.objects.instance [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid a9a1475a-af89-477b-bc8b-31a79fa63f3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.977 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <uuid>a9a1475a-af89-477b-bc8b-31a79fa63f3e</uuid>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <name>instance-00000012</name>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981</nova:name>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:28:16</nova:creationTime>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        <nova:port uuid="620ad6fc-218e-4453-8811-06bd2d12bb67">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <entry name="serial">a9a1475a-af89-477b-bc8b-31a79fa63f3e</entry>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <entry name="uuid">a9a1475a-af89-477b-bc8b-31a79fa63f3e</entry>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.config"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:e3:61:d8"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <target dev="tap620ad6fc-21"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/console.log" append="off"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:28:16 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:28:16 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:28:16 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:28:16 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.978 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Preparing to wait for external event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.979 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.979 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.979 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.980 183134 DEBUG nova.virt.libvirt.vif [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=18,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNCq5HZPgINJROVJRLOqP2eaN821VUl6YKdWNtUZ9NbgWd7o4YI/tIyy4Hs0UiWxnKw5E/5LJHRl+c6TQFgJ4DtjJZbbvvVQ99OtYsa/oy7cAgv21uCmQWp+31SzPJJP0A==',key_name='tempest-TestSecurityGroupsBasicOps-2143582879',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-xx3cbvri',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:28:05Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=a9a1475a-af89-477b-bc8b-31a79fa63f3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.981 183134 DEBUG nova.network.os_vif_util [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.981 183134 DEBUG nova.network.os_vif_util [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.981 183134 DEBUG os_vif [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.982 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.982 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.982 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.983 183134 INFO nova.compute.manager [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Swapping old allocation on dict_keys(['eb11f67d-14b4-46ee-89fd-92936c45ed58']) held by migration 88445283-129b-4f87-a75d-ab640dcf2a52 for instance#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.986 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.987 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap620ad6fc-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.987 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap620ad6fc-21, col_values=(('external_ids', {'iface-id': '620ad6fc-218e-4453-8811-06bd2d12bb67', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:61:d8', 'vm-uuid': 'a9a1475a-af89-477b-bc8b-31a79fa63f3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.988 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:16 np0005601977 NetworkManager[55565]: <info>  [1769765296.9896] manager: (tap620ad6fc-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.990 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.993 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:16 np0005601977 nova_compute[183130]: 2026-01-30 09:28:16.993 183134 INFO os_vif [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21')#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.034 183134 DEBUG nova.scheduler.client.report [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Overwriting current allocation {'allocations': {'5d3b70f5-f112-4478-a29a-fcced7bddfa0': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 19}}, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'consumer_generation': 1} on consumer 37aaa571-2821-4d88-b360-9f7b02c1aa1b move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.040 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.041 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.041 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:e3:61:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.041 183134 INFO nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Using config drive#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.483 183134 INFO nova.network.neutron [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating port 6a96c970-8213-4137-b6a7-4c31f1488ad5 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.758 183134 INFO nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Creating config drive at /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.config#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.762 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpln3g_mjh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.886 183134 DEBUG oslo_concurrency.processutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpln3g_mjh" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:17 np0005601977 kernel: tap620ad6fc-21: entered promiscuous mode
Jan 30 04:28:17 np0005601977 NetworkManager[55565]: <info>  [1769765297.9615] manager: (tap620ad6fc-21): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 30 04:28:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:17Z|00112|binding|INFO|Claiming lport 620ad6fc-218e-4453-8811-06bd2d12bb67 for this chassis.
Jan 30 04:28:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:17Z|00113|binding|INFO|620ad6fc-218e-4453-8811-06bd2d12bb67: Claiming fa:16:3e:e3:61:d8 10.100.0.5
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.964 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.982 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:61:d8 10.100.0.5'], port_security=['fa:16:3e:e3:61:d8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a9a1475a-af89-477b-bc8b-31a79fa63f3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-56316a99-761e-4baf-8191-d82570ee0e52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '288b11f4-9cad-44da-942a-7dd048ba32a1 5c5a5a2c-f482-4bc9-b63a-50670c65d485', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a42d4db-c73b-4bac-8cd6-22027f015570, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=620ad6fc-218e-4453-8811-06bd2d12bb67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.983 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 620ad6fc-218e-4453-8811-06bd2d12bb67 in datapath 56316a99-761e-4baf-8191-d82570ee0e52 bound to our chassis#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.985 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 56316a99-761e-4baf-8191-d82570ee0e52#033[00m
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.991 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.995 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[61b84bc7-8928-49c6-904d-16c4ec039b83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.996 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap56316a99-71 in ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:28:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:17Z|00114|binding|INFO|Setting lport 620ad6fc-218e-4453-8811-06bd2d12bb67 ovn-installed in OVS
Jan 30 04:28:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:17Z|00115|binding|INFO|Setting lport 620ad6fc-218e-4453-8811-06bd2d12bb67 up in Southbound
Jan 30 04:28:17 np0005601977 nova_compute[183130]: 2026-01-30 09:28:17.998 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.998 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap56316a99-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:28:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.998 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9996a3-f3c4-4088-a4d3-a237b5f32120]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:17 np0005601977 systemd-udevd[214990]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:17.999 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba2274e-498d-4058-8bc2-7279b7a72ee6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 systemd-machined[154431]: New machine qemu-9-instance-00000012.
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.007 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[59d858bc-db57-4893-a4f4-5f1d646513b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 NetworkManager[55565]: <info>  [1769765298.0122] device (tap620ad6fc-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:28:18 np0005601977 NetworkManager[55565]: <info>  [1769765298.0133] device (tap620ad6fc-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:28:18 np0005601977 systemd[1]: Started Virtual Machine qemu-9-instance-00000012.
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.018 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a1409cdb-22df-4160-8d47-d441c8781449]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.039 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[11d04f44-6f15-40bb-a7c4-421e01f28c5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 NetworkManager[55565]: <info>  [1769765298.0450] manager: (tap56316a99-70): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.043 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[76356f9a-ccf8-47b1-836e-86a586a22281]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.073 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[545255eb-48b0-408e-a595-886da8d05e20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.077 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[7171eb44-2d35-4922-808e-bfccd2a05e2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 podman[214963]: 2026-01-30 09:28:18.091201971 +0000 UTC m=+0.134525204 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible)
Jan 30 04:28:18 np0005601977 podman[214964]: 2026-01-30 09:28:18.094410693 +0000 UTC m=+0.137869420 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:28:18 np0005601977 NetworkManager[55565]: <info>  [1769765298.0964] device (tap56316a99-70): carrier: link connected
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.100 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[849dbf87-3d52-4583-a85d-d331aaf07a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.113 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[110d736b-4818-4e61-a6ce-64d98124fd13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap56316a99-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:11:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382943, 'reachable_time': 31696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215041, 'error': None, 'target': 'ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.121 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5e56ce62-c6a5-48c5-b31d-8a1eeab8bf93]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:1124'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382943, 'tstamp': 382943}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215042, 'error': None, 'target': 'ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.139 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[60ab8b9c-9aac-4237-8954-2733315b4037]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap56316a99-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:11:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382943, 'reachable_time': 31696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215043, 'error': None, 'target': 'ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.165 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0d30ba1a-0cd8-4ac4-a6e1-3fe25be727f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.205 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e7900ea2-2890-40bb-8c77-f938470ccca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.208 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56316a99-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.209 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.210 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56316a99-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.212 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:18 np0005601977 kernel: tap56316a99-70: entered promiscuous mode
Jan 30 04:28:18 np0005601977 NetworkManager[55565]: <info>  [1769765298.2134] manager: (tap56316a99-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.214 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.219 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap56316a99-70, col_values=(('external_ids', {'iface-id': '35ddecda-f2fa-473b-b271-24bd26df329a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:18Z|00116|binding|INFO|Releasing lport 35ddecda-f2fa-473b-b271-24bd26df329a from this chassis (sb_readonly=0)
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.220 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.223 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/56316a99-761e-4baf-8191-d82570ee0e52.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/56316a99-761e-4baf-8191-d82570ee0e52.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.224 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[38ee48b4-7aaa-4051-852b-2b3c0e372419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.225 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.225 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-56316a99-761e-4baf-8191-d82570ee0e52
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/56316a99-761e-4baf-8191-d82570ee0e52.pid.haproxy
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 56316a99-761e-4baf-8191-d82570ee0e52
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:28:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:18.227 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52', 'env', 'PROCESS_TAG=haproxy-56316a99-761e-4baf-8191-d82570ee0e52', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/56316a99-761e-4baf-8191-d82570ee0e52.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.391 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765298.390999, a9a1475a-af89-477b-bc8b-31a79fa63f3e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.391 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] VM Started (Lifecycle Event)#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.447 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.452 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765298.3912094, a9a1475a-af89-477b-bc8b-31a79fa63f3e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.452 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.564 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.567 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.593 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:28:18 np0005601977 podman[215082]: 2026-01-30 09:28:18.597342132 +0000 UTC m=+0.051613787 container create 94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:28:18 np0005601977 systemd[1]: Started libpod-conmon-94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300.scope.
Jan 30 04:28:18 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:28:18 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b2c762e855db3108fe33561342837121041701c737733ac56d60e5846054f1d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:28:18 np0005601977 podman[215082]: 2026-01-30 09:28:18.660036707 +0000 UTC m=+0.114308362 container init 94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:28:18 np0005601977 podman[215082]: 2026-01-30 09:28:18.664412893 +0000 UTC m=+0.118684538 container start 94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 30 04:28:18 np0005601977 podman[215082]: 2026-01-30 09:28:18.571126088 +0000 UTC m=+0.025397753 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:28:18 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [NOTICE]   (215101) : New worker (215103) forked
Jan 30 04:28:18 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [NOTICE]   (215101) : Loading success.
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.945 183134 DEBUG nova.compute.manager [req-bd1a3123-4399-43a4-8963-0c68657d56b9 req-929e3a5d-355a-4d50-b1c9-47450ffd0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.945 183134 DEBUG oslo_concurrency.lockutils [req-bd1a3123-4399-43a4-8963-0c68657d56b9 req-929e3a5d-355a-4d50-b1c9-47450ffd0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.946 183134 DEBUG oslo_concurrency.lockutils [req-bd1a3123-4399-43a4-8963-0c68657d56b9 req-929e3a5d-355a-4d50-b1c9-47450ffd0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.946 183134 DEBUG oslo_concurrency.lockutils [req-bd1a3123-4399-43a4-8963-0c68657d56b9 req-929e3a5d-355a-4d50-b1c9-47450ffd0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.946 183134 DEBUG nova.compute.manager [req-bd1a3123-4399-43a4-8963-0c68657d56b9 req-929e3a5d-355a-4d50-b1c9-47450ffd0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Processing event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.947 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.950 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765298.950769, a9a1475a-af89-477b-bc8b-31a79fa63f3e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.951 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.953 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.957 183134 INFO nova.virt.libvirt.driver [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Instance spawned successfully.#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.957 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.990 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:18 np0005601977 nova_compute[183130]: 2026-01-30 09:28:18.993 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.001 183134 DEBUG nova.network.neutron [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updated VIF entry in instance network info cache for port 620ad6fc-218e-4453-8811-06bd2d12bb67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.002 183134 DEBUG nova.network.neutron [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updating instance_info_cache with network_info: [{"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.007 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.007 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.007 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.008 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.008 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.009 183134 DEBUG nova.virt.libvirt.driver [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.013 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.013 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.013 183134 DEBUG nova.network.neutron [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.041 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.050 183134 DEBUG oslo_concurrency.lockutils [req-f933a41f-0503-4287-b90d-90ce599d7dc0 req-7cc44976-1a86-4f67-bc92-7998784a3eea dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.098 183134 INFO nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Took 13.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.098 183134 DEBUG nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.210 183134 INFO nova.compute.manager [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Took 14.16 seconds to build instance.#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.236 183134 DEBUG oslo_concurrency.lockutils [None req-60355d98-9b7b-45f2-9bae-2a7b8cc889fd 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.446 183134 DEBUG nova.compute.manager [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.447 183134 DEBUG nova.compute.manager [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing instance network info cache due to event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.447 183134 DEBUG oslo_concurrency.lockutils [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:19 np0005601977 nova_compute[183130]: 2026-01-30 09:28:19.946 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:21 np0005601977 nova_compute[183130]: 2026-01-30 09:28:21.989 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.364 183134 DEBUG nova.network.neutron [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.389 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.389 183134 DEBUG nova.virt.libvirt.driver [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.393 183134 DEBUG oslo_concurrency.lockutils [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.394 183134 DEBUG nova.network.neutron [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.403 183134 DEBUG nova.virt.libvirt.driver [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Start _get_guest_xml network_info=[{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.415 183134 WARNING nova.virt.libvirt.driver [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.425 183134 DEBUG nova.virt.libvirt.host [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.425 183134 DEBUG nova.virt.libvirt.host [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.436 183134 DEBUG nova.virt.libvirt.host [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.436 183134 DEBUG nova.virt.libvirt.host [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.438 183134 DEBUG nova.virt.libvirt.driver [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.438 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.439 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.439 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.439 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.439 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.440 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.440 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.440 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.441 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.441 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.441 183134 DEBUG nova.virt.hardware [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.441 183134 DEBUG nova.objects.instance [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 37aaa571-2821-4d88-b360-9f7b02c1aa1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.476 183134 DEBUG oslo_concurrency.processutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.518 183134 DEBUG oslo_concurrency.processutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.519 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.519 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.520 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.522 183134 DEBUG nova.virt.libvirt.vif [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:26:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-809581554',display_name='tempest-TestNetworkAdvancedServerOps-server-809581554',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-809581554',id=12,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFkWWUcp4/hru5LqJv27yDpf+1+iRsZOi/M8GbB/5I7iXHwxefzcmFLbcQt/GAvJQU8x8sEPj2RwtuV5cYtwmvilMTvuGdMtuc0URBoOB56Fvi6bvlKiWYrL++0Kht1i5g==',key_name='tempest-TestNetworkAdvancedServerOps-1685274608',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:28:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-hjlly9q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:28:15Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=37aaa571-2821-4d88-b360-9f7b02c1aa1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.522 183134 DEBUG nova.network.os_vif_util [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.523 183134 DEBUG nova.network.os_vif_util [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.526 183134 DEBUG nova.virt.libvirt.driver [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <uuid>37aaa571-2821-4d88-b360-9f7b02c1aa1b</uuid>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <name>instance-0000000c</name>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-809581554</nova:name>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:28:22</nova:creationTime>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        <nova:port uuid="6a96c970-8213-4137-b6a7-4c31f1488ad5">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <entry name="serial">37aaa571-2821-4d88-b360-9f7b02c1aa1b</entry>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <entry name="uuid">37aaa571-2821-4d88-b360-9f7b02c1aa1b</entry>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/disk.config"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:54:7d:f1"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <target dev="tap6a96c970-82"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b/console.log" append="off"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <input type="keyboard" bus="usb"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:28:22 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:28:22 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:28:22 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:28:22 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.527 183134 DEBUG nova.compute.manager [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Preparing to wait for external event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.528 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.529 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.529 183134 DEBUG oslo_concurrency.lockutils [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.530 183134 DEBUG nova.virt.libvirt.vif [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:26:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-809581554',display_name='tempest-TestNetworkAdvancedServerOps-server-809581554',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-809581554',id=12,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFkWWUcp4/hru5LqJv27yDpf+1+iRsZOi/M8GbB/5I7iXHwxefzcmFLbcQt/GAvJQU8x8sEPj2RwtuV5cYtwmvilMTvuGdMtuc0URBoOB56Fvi6bvlKiWYrL++0Kht1i5g==',key_name='tempest-TestNetworkAdvancedServerOps-1685274608',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:28:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-hjlly9q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:28:15Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=37aaa571-2821-4d88-b360-9f7b02c1aa1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.530 183134 DEBUG nova.network.os_vif_util [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.531 183134 DEBUG nova.network.os_vif_util [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.532 183134 DEBUG os_vif [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.533 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.533 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.534 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.536 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.537 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a96c970-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.537 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a96c970-82, col_values=(('external_ids', {'iface-id': '6a96c970-8213-4137-b6a7-4c31f1488ad5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:7d:f1', 'vm-uuid': '37aaa571-2821-4d88-b360-9f7b02c1aa1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.579 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.5810] manager: (tap6a96c970-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.582 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.585 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.586 183134 INFO os_vif [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82')#033[00m
Jan 30 04:28:22 np0005601977 kernel: tap6a96c970-82: entered promiscuous mode
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.6518] manager: (tap6a96c970-82): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.654 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 systemd-udevd[215129]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.658 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00117|binding|INFO|Claiming lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 for this chassis.
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00118|binding|INFO|6a96c970-8213-4137-b6a7-4c31f1488ad5: Claiming fa:16:3e:54:7d:f1 10.100.0.9
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.6713] device (tap6a96c970-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.6736] device (tap6a96c970-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:28:22 np0005601977 systemd-machined[154431]: New machine qemu-10-instance-0000000c.
Jan 30 04:28:22 np0005601977 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.7091] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.7097] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.708 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.713 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:f1 10.100.0.9'], port_security=['fa:16:3e:54:7d:f1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf368f4-a96c-4392-8db3-50f404160fc3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2b8909af-505c-44a2-86bd-406e9cde5945', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2106cc88-5033-406a-bbeb-096c7422d7cf, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6a96c970-8213-4137-b6a7-4c31f1488ad5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.714 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6a96c970-8213-4137-b6a7-4c31f1488ad5 in datapath 9cf368f4-a96c-4392-8db3-50f404160fc3 bound to our chassis#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.716 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf368f4-a96c-4392-8db3-50f404160fc3#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.725 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5d7354ac-4f58-4071-a10a-d47ba2ed98ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.726 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf368f4-a1 in ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.727 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf368f4-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.727 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8a4e89-a534-41c2-a351-94029bf79292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.727 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[087d73fa-c959-4bde-afdd-d851122230e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.735 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c03424-b2a8-4bae-85e0-e834f6260566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.757 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d142e1af-0e16-41a2-b0bb-92ae7c48b894]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.775 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[014e7817-af63-4ad3-805a-12ae8a93d6a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.779 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8389bb43-a9db-45c2-94d6-bd4c7e3b1ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.7808] manager: (tap9cf368f4-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.780 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 systemd-udevd[215134]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00119|binding|INFO|Releasing lport 35ddecda-f2fa-473b-b271-24bd26df329a from this chassis (sb_readonly=0)
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00120|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.804 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd4bad8-7739-4429-ba87-268a763885f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.808 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[4e5988b9-073e-4adc-9234-8c31e41ce72e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.8239] device (tap9cf368f4-a0): carrier: link connected
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.826 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e326ea9c-4213-43a0-999d-3a63d43d0424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.829 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00121|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 ovn-installed in OVS
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00122|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 up in Southbound
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.834 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.839 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[580d24c5-fbaa-4542-985a-d31919160a54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf368f4-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:13:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383416, 'reachable_time': 30913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215165, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.851 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[005f4599-39e8-488e-b9ca-e6bbfddbb1f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:13e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383416, 'tstamp': 383416}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215166, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.862 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[25484147-e814-457b-bf6f-5ef88800a6d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf368f4-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:13:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383416, 'reachable_time': 30913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215167, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.887 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b3bb3801-d659-4a15-8fa5-81f42c31827a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.922 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[204226e2-32ac-437a-8538-6edc5da01d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.923 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf368f4-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.923 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.924 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf368f4-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.925 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 kernel: tap9cf368f4-a0: entered promiscuous mode
Jan 30 04:28:22 np0005601977 NetworkManager[55565]: <info>  [1769765302.9277] manager: (tap9cf368f4-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.927 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf368f4-a0, col_values=(('external_ids', {'iface-id': '0c4430cb-14c7-41cc-8074-5659c58e2db6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:22Z|00123|binding|INFO|Releasing lport 0c4430cb-14c7-41cc-8074-5659c58e2db6 from this chassis (sb_readonly=0)
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.929 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.930 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf368f4-a96c-4392-8db3-50f404160fc3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf368f4-a96c-4392-8db3-50f404160fc3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.930 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a2e4f6-8648-4b00-b0fd-6be7724d4c95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.931 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-9cf368f4-a96c-4392-8db3-50f404160fc3
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/9cf368f4-a96c-4392-8db3-50f404160fc3.pid.haproxy
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 9cf368f4-a96c-4392-8db3-50f404160fc3
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:28:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:22.932 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'env', 'PROCESS_TAG=haproxy-9cf368f4-a96c-4392-8db3-50f404160fc3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf368f4-a96c-4392-8db3-50f404160fc3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:28:22 np0005601977 nova_compute[183130]: 2026-01-30 09:28:22.933 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.010 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765303.010256, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.011 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Started (Lifecycle Event)#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.028 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.032 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765303.0113738, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.032 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.055 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.060 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.082 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 30 04:28:23 np0005601977 podman[215206]: 2026-01-30 09:28:23.273320681 +0000 UTC m=+0.053964624 container create e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 30 04:28:23 np0005601977 systemd[1]: Started libpod-conmon-e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9.scope.
Jan 30 04:28:23 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:28:23 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ddd7920348df022094c000a4b30a0dbb32035ab58f931faacbd2ea3573dd47e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:28:23 np0005601977 podman[215206]: 2026-01-30 09:28:23.251295437 +0000 UTC m=+0.031939390 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:28:23 np0005601977 podman[215206]: 2026-01-30 09:28:23.350840513 +0000 UTC m=+0.131484556 container init e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 30 04:28:23 np0005601977 podman[215206]: 2026-01-30 09:28:23.355066265 +0000 UTC m=+0.135710208 container start e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:28:23 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [NOTICE]   (215244) : New worker (215252) forked
Jan 30 04:28:23 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [NOTICE]   (215244) : Loading success.
Jan 30 04:28:23 np0005601977 podman[215221]: 2026-01-30 09:28:23.413343242 +0000 UTC m=+0.096852999 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.813 183134 DEBUG nova.compute.manager [req-951415df-2e76-446a-bfb1-dc8a22586993 req-493068bc-e214-458e-bd53-d0bd30e6ec2d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.814 183134 DEBUG oslo_concurrency.lockutils [req-951415df-2e76-446a-bfb1-dc8a22586993 req-493068bc-e214-458e-bd53-d0bd30e6ec2d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.814 183134 DEBUG oslo_concurrency.lockutils [req-951415df-2e76-446a-bfb1-dc8a22586993 req-493068bc-e214-458e-bd53-d0bd30e6ec2d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.815 183134 DEBUG oslo_concurrency.lockutils [req-951415df-2e76-446a-bfb1-dc8a22586993 req-493068bc-e214-458e-bd53-d0bd30e6ec2d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.816 183134 DEBUG nova.compute.manager [req-951415df-2e76-446a-bfb1-dc8a22586993 req-493068bc-e214-458e-bd53-d0bd30e6ec2d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] No waiting events found dispatching network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:28:23 np0005601977 nova_compute[183130]: 2026-01-30 09:28:23.816 183134 WARNING nova.compute.manager [req-951415df-2e76-446a-bfb1-dc8a22586993 req-493068bc-e214-458e-bd53-d0bd30e6ec2d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received unexpected event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.323 183134 DEBUG nova.compute.manager [req-f356a186-b5c2-4d64-aaf8-25ce8c7e221c req-8676154f-fe36-4421-af3c-686080f76b4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.323 183134 DEBUG oslo_concurrency.lockutils [req-f356a186-b5c2-4d64-aaf8-25ce8c7e221c req-8676154f-fe36-4421-af3c-686080f76b4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.324 183134 DEBUG oslo_concurrency.lockutils [req-f356a186-b5c2-4d64-aaf8-25ce8c7e221c req-8676154f-fe36-4421-af3c-686080f76b4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.324 183134 DEBUG oslo_concurrency.lockutils [req-f356a186-b5c2-4d64-aaf8-25ce8c7e221c req-8676154f-fe36-4421-af3c-686080f76b4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.325 183134 DEBUG nova.compute.manager [req-f356a186-b5c2-4d64-aaf8-25ce8c7e221c req-8676154f-fe36-4421-af3c-686080f76b4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Processing event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.326 183134 DEBUG nova.compute.manager [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.332 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765304.3299189, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.332 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.338 183134 INFO nova.virt.libvirt.driver [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance running successfully.#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.339 183134 DEBUG nova.virt.libvirt.driver [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.377 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.379 183134 DEBUG nova.network.neutron [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updated VIF entry in instance network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.380 183134 DEBUG nova.network.neutron [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.385 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.424 183134 DEBUG oslo_concurrency.lockutils [req-9249f4f3-fbf6-42f4-9ecd-8017c8e6ca71 req-a0948eab-94f3-482c-a1b8-df66e4b9bf1d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.426 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.451 183134 INFO nova.compute.manager [None req-01cf4144-0132-433a-90c7-3c3126778f82 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance to original state: 'active'#033[00m
Jan 30 04:28:24 np0005601977 nova_compute[183130]: 2026-01-30 09:28:24.990 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.517 183134 DEBUG nova.compute.manager [req-4ea64fd3-f92d-4a5f-ac90-79b70e686a57 req-e004dd54-806f-4082-b297-210eecfaee7d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.518 183134 DEBUG oslo_concurrency.lockutils [req-4ea64fd3-f92d-4a5f-ac90-79b70e686a57 req-e004dd54-806f-4082-b297-210eecfaee7d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.518 183134 DEBUG oslo_concurrency.lockutils [req-4ea64fd3-f92d-4a5f-ac90-79b70e686a57 req-e004dd54-806f-4082-b297-210eecfaee7d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.518 183134 DEBUG oslo_concurrency.lockutils [req-4ea64fd3-f92d-4a5f-ac90-79b70e686a57 req-e004dd54-806f-4082-b297-210eecfaee7d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.518 183134 DEBUG nova.compute.manager [req-4ea64fd3-f92d-4a5f-ac90-79b70e686a57 req-e004dd54-806f-4082-b297-210eecfaee7d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.519 183134 WARNING nova.compute.manager [req-4ea64fd3-f92d-4a5f-ac90-79b70e686a57 req-e004dd54-806f-4082-b297-210eecfaee7d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:28:27 np0005601977 nova_compute[183130]: 2026-01-30 09:28:27.579 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:29Z|00124|binding|INFO|Releasing lport 0c4430cb-14c7-41cc-8074-5659c58e2db6 from this chassis (sb_readonly=0)
Jan 30 04:28:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:29Z|00125|binding|INFO|Releasing lport 35ddecda-f2fa-473b-b271-24bd26df329a from this chassis (sb_readonly=0)
Jan 30 04:28:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:29Z|00126|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:28:29 np0005601977 nova_compute[183130]: 2026-01-30 09:28:29.875 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:29.971 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:28:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:29.972 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:28:29 np0005601977 nova_compute[183130]: 2026-01-30 09:28:29.972 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:29.973 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:29 np0005601977 nova_compute[183130]: 2026-01-30 09:28:29.991 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:30Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e3:61:d8 10.100.0.5
Jan 30 04:28:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:30Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e3:61:d8 10.100.0.5
Jan 30 04:28:31 np0005601977 podman[215284]: 2026-01-30 09:28:31.822050921 +0000 UTC m=+0.043202955 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:28:32 np0005601977 nova_compute[183130]: 2026-01-30 09:28:32.582 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:34 np0005601977 nova_compute[183130]: 2026-01-30 09:28:34.992 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:35Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:7d:f1 10.100.0.9
Jan 30 04:28:36 np0005601977 nova_compute[183130]: 2026-01-30 09:28:36.558 183134 DEBUG nova.compute.manager [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-changed-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:36 np0005601977 nova_compute[183130]: 2026-01-30 09:28:36.559 183134 DEBUG nova.compute.manager [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Refreshing instance network info cache due to event network-changed-620ad6fc-218e-4453-8811-06bd2d12bb67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:28:36 np0005601977 nova_compute[183130]: 2026-01-30 09:28:36.559 183134 DEBUG oslo_concurrency.lockutils [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:36 np0005601977 nova_compute[183130]: 2026-01-30 09:28:36.559 183134 DEBUG oslo_concurrency.lockutils [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:36 np0005601977 nova_compute[183130]: 2026-01-30 09:28:36.559 183134 DEBUG nova.network.neutron [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Refreshing network info cache for port 620ad6fc-218e-4453-8811-06bd2d12bb67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:28:37 np0005601977 nova_compute[183130]: 2026-01-30 09:28:37.584 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:37 np0005601977 nova_compute[183130]: 2026-01-30 09:28:37.682 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:39 np0005601977 nova_compute[183130]: 2026-01-30 09:28:39.994 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:40 np0005601977 podman[215325]: 2026-01-30 09:28:40.832853536 +0000 UTC m=+0.049088234 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:28:40 np0005601977 podman[215324]: 2026-01-30 09:28:40.855953501 +0000 UTC m=+0.072353714 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Jan 30 04:28:41 np0005601977 nova_compute[183130]: 2026-01-30 09:28:41.491 183134 DEBUG nova.network.neutron [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updated VIF entry in instance network info cache for port 620ad6fc-218e-4453-8811-06bd2d12bb67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:28:41 np0005601977 nova_compute[183130]: 2026-01-30 09:28:41.492 183134 DEBUG nova.network.neutron [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updating instance_info_cache with network_info: [{"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:28:41 np0005601977 nova_compute[183130]: 2026-01-30 09:28:41.558 183134 DEBUG oslo_concurrency.lockutils [req-3d3981f7-9436-4c7c-9bad-2754c176e024 req-68e4aa25-deb8-49cf-904f-9750e61b2050 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:28:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:41Z|00127|binding|INFO|Releasing lport 0c4430cb-14c7-41cc-8074-5659c58e2db6 from this chassis (sb_readonly=0)
Jan 30 04:28:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:41Z|00128|binding|INFO|Releasing lport 35ddecda-f2fa-473b-b271-24bd26df329a from this chassis (sb_readonly=0)
Jan 30 04:28:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:41Z|00129|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:28:41 np0005601977 nova_compute[183130]: 2026-01-30 09:28:41.796 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:42 np0005601977 nova_compute[183130]: 2026-01-30 09:28:42.588 183134 INFO nova.compute.manager [None req-2130912b-b87a-416b-b623-4fed262e283c 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Get console output#033[00m
Jan 30 04:28:42 np0005601977 nova_compute[183130]: 2026-01-30 09:28:42.588 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:42 np0005601977 nova_compute[183130]: 2026-01-30 09:28:42.597 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:28:44 np0005601977 nova_compute[183130]: 2026-01-30 09:28:44.998 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.000 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.876 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.877 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.878 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.878 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.878 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.879 183134 INFO nova.compute.manager [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Terminating instance#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.880 183134 DEBUG nova.compute.manager [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:28:45 np0005601977 kernel: tap620ad6fc-21 (unregistering): left promiscuous mode
Jan 30 04:28:45 np0005601977 NetworkManager[55565]: <info>  [1769765325.9066] device (tap620ad6fc-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.912 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:45Z|00130|binding|INFO|Releasing lport 620ad6fc-218e-4453-8811-06bd2d12bb67 from this chassis (sb_readonly=0)
Jan 30 04:28:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:45Z|00131|binding|INFO|Setting lport 620ad6fc-218e-4453-8811-06bd2d12bb67 down in Southbound
Jan 30 04:28:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:45Z|00132|binding|INFO|Removing iface tap620ad6fc-21 ovn-installed in OVS
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.915 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:45 np0005601977 nova_compute[183130]: 2026-01-30 09:28:45.919 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:45 np0005601977 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 30 04:28:45 np0005601977 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000012.scope: Consumed 12.143s CPU time.
Jan 30 04:28:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:45.952 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:61:d8 10.100.0.5'], port_security=['fa:16:3e:e3:61:d8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a9a1475a-af89-477b-bc8b-31a79fa63f3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-56316a99-761e-4baf-8191-d82570ee0e52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '288b11f4-9cad-44da-942a-7dd048ba32a1 5c5a5a2c-f482-4bc9-b63a-50670c65d485', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a42d4db-c73b-4bac-8cd6-22027f015570, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=620ad6fc-218e-4453-8811-06bd2d12bb67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:28:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:45.953 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 620ad6fc-218e-4453-8811-06bd2d12bb67 in datapath 56316a99-761e-4baf-8191-d82570ee0e52 unbound from our chassis#033[00m
Jan 30 04:28:45 np0005601977 systemd-machined[154431]: Machine qemu-9-instance-00000012 terminated.
Jan 30 04:28:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:45.955 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 56316a99-761e-4baf-8191-d82570ee0e52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:28:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:45.957 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[541bf42f-3eba-40c6-bf2b-74c12aadbcb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:45.957 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52 namespace which is not needed anymore#033[00m
Jan 30 04:28:46 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [NOTICE]   (215101) : haproxy version is 2.8.14-c23fe91
Jan 30 04:28:46 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [NOTICE]   (215101) : path to executable is /usr/sbin/haproxy
Jan 30 04:28:46 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [WARNING]  (215101) : Exiting Master process...
Jan 30 04:28:46 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [ALERT]    (215101) : Current worker (215103) exited with code 143 (Terminated)
Jan 30 04:28:46 np0005601977 neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52[215097]: [WARNING]  (215101) : All workers exited. Exiting... (0)
Jan 30 04:28:46 np0005601977 systemd[1]: libpod-94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300.scope: Deactivated successfully.
Jan 30 04:28:46 np0005601977 podman[215385]: 2026-01-30 09:28:46.082971973 +0000 UTC m=+0.046536950 container died 94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:28:46 np0005601977 systemd[1]: var-lib-containers-storage-overlay-8b2c762e855db3108fe33561342837121041701c737733ac56d60e5846054f1d-merged.mount: Deactivated successfully.
Jan 30 04:28:46 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300-userdata-shm.mount: Deactivated successfully.
Jan 30 04:28:46 np0005601977 podman[215385]: 2026-01-30 09:28:46.11551725 +0000 UTC m=+0.079082217 container cleanup 94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:28:46 np0005601977 systemd[1]: libpod-conmon-94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300.scope: Deactivated successfully.
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.134 183134 INFO nova.virt.libvirt.driver [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Instance destroyed successfully.#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.135 183134 DEBUG nova.objects.instance [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid a9a1475a-af89-477b-bc8b-31a79fa63f3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:28:46 np0005601977 podman[215427]: 2026-01-30 09:28:46.171732639 +0000 UTC m=+0.041093524 container remove 94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.174 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ec9406-8273-4b82-8460-f857c0241464]: (4, ('Fri Jan 30 09:28:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52 (94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300)\n94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300\nFri Jan 30 09:28:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52 (94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300)\n94d3f8535257d20d54e3fc18adc5b502e3f7e0f620b9f3fda77cdb662791a300\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.176 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[20f41171-dfd2-443a-bad9-3d9c1e8018a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.177 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56316a99-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.203 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:46 np0005601977 kernel: tap56316a99-70: left promiscuous mode
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.210 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.211 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.213 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb76fae-9987-44cc-905e-612f3cb81ac5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.224 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[14fa7a96-599e-46c7-bb9c-95714e494981]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.225 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b9767bd7-097b-45af-9d70-ae1500a7e7a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.235 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c87aba75-b428-470f-9f2f-608204721a87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382937, 'reachable_time': 27735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215453, 'error': None, 'target': 'ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.237 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-56316a99-761e-4baf-8191-d82570ee0e52 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:28:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:46.237 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[cc913bc5-0531-4188-bbf9-f2e0e77d56ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:46 np0005601977 systemd[1]: run-netns-ovnmeta\x2d56316a99\x2d761e\x2d4baf\x2d8191\x2dd82570ee0e52.mount: Deactivated successfully.
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.520 183134 DEBUG nova.compute.manager [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-changed-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.521 183134 DEBUG nova.compute.manager [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Refreshing instance network info cache due to event network-changed-620ad6fc-218e-4453-8811-06bd2d12bb67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.522 183134 DEBUG oslo_concurrency.lockutils [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.522 183134 DEBUG oslo_concurrency.lockutils [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:46 np0005601977 nova_compute[183130]: 2026-01-30 09:28:46.523 183134 DEBUG nova.network.neutron [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Refreshing network info cache for port 620ad6fc-218e-4453-8811-06bd2d12bb67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.043 183134 DEBUG nova.virt.libvirt.vif [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-266238981',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=18,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNCq5HZPgINJROVJRLOqP2eaN821VUl6YKdWNtUZ9NbgWd7o4YI/tIyy4Hs0UiWxnKw5E/5LJHRl+c6TQFgJ4DtjJZbbvvVQ99OtYsa/oy7cAgv21uCmQWp+31SzPJJP0A==',key_name='tempest-TestSecurityGroupsBasicOps-2143582879',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:28:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-xx3cbvri',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:28:19Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=a9a1475a-af89-477b-bc8b-31a79fa63f3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.044 183134 DEBUG nova.network.os_vif_util [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.045 183134 DEBUG nova.network.os_vif_util [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.046 183134 DEBUG os_vif [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.048 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.049 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap620ad6fc-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.051 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.053 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.056 183134 INFO os_vif [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:61:d8,bridge_name='br-int',has_traffic_filtering=True,id=620ad6fc-218e-4453-8811-06bd2d12bb67,network=Network(56316a99-761e-4baf-8191-d82570ee0e52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap620ad6fc-21')#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.057 183134 INFO nova.virt.libvirt.driver [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Deleting instance files /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e_del#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.059 183134 INFO nova.virt.libvirt.driver [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Deletion of /var/lib/nova/instances/a9a1475a-af89-477b-bc8b-31a79fa63f3e_del complete#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.176 183134 INFO nova.compute.manager [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Took 1.30 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.178 183134 DEBUG oslo.service.loopingcall [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.178 183134 DEBUG nova.compute.manager [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:28:47 np0005601977 nova_compute[183130]: 2026-01-30 09:28:47.179 183134 DEBUG nova.network.neutron [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.718 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.719 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.720 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.721 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.721 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.724 183134 INFO nova.compute.manager [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Terminating instance#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.726 183134 DEBUG nova.compute.manager [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:28:48 np0005601977 kernel: tap6a96c970-82 (unregistering): left promiscuous mode
Jan 30 04:28:48 np0005601977 NetworkManager[55565]: <info>  [1769765328.7552] device (tap6a96c970-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:28:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:48Z|00133|binding|INFO|Releasing lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 from this chassis (sb_readonly=0)
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.763 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:48Z|00134|binding|INFO|Setting lport 6a96c970-8213-4137-b6a7-4c31f1488ad5 down in Southbound
Jan 30 04:28:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:48Z|00135|binding|INFO|Removing iface tap6a96c970-82 ovn-installed in OVS
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.765 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.769 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:48 np0005601977 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 30 04:28:48 np0005601977 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 12.369s CPU time.
Jan 30 04:28:48 np0005601977 systemd-machined[154431]: Machine qemu-10-instance-0000000c terminated.
Jan 30 04:28:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:48.805 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:f1 10.100.0.9'], port_security=['fa:16:3e:54:7d:f1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '37aaa571-2821-4d88-b360-9f7b02c1aa1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf368f4-a96c-4392-8db3-50f404160fc3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '12', 'neutron:security_group_ids': '2b8909af-505c-44a2-86bd-406e9cde5945', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2106cc88-5033-406a-bbeb-096c7422d7cf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6a96c970-8213-4137-b6a7-4c31f1488ad5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:28:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:48.807 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6a96c970-8213-4137-b6a7-4c31f1488ad5 in datapath 9cf368f4-a96c-4392-8db3-50f404160fc3 unbound from our chassis#033[00m
Jan 30 04:28:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:48.809 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf368f4-a96c-4392-8db3-50f404160fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:28:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:48.810 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e95b43-d2a6-493b-b97d-5ef6c139b993]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:48.811 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 namespace which is not needed anymore#033[00m
Jan 30 04:28:48 np0005601977 podman[215459]: 2026-01-30 09:28:48.837799093 +0000 UTC m=+0.059326409 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:28:48 np0005601977 podman[215454]: 2026-01-30 09:28:48.839501762 +0000 UTC m=+0.061047959 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:28:48 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [NOTICE]   (215244) : haproxy version is 2.8.14-c23fe91
Jan 30 04:28:48 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [NOTICE]   (215244) : path to executable is /usr/sbin/haproxy
Jan 30 04:28:48 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [WARNING]  (215244) : Exiting Master process...
Jan 30 04:28:48 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [ALERT]    (215244) : Current worker (215252) exited with code 143 (Terminated)
Jan 30 04:28:48 np0005601977 neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3[215222]: [WARNING]  (215244) : All workers exited. Exiting... (0)
Jan 30 04:28:48 np0005601977 systemd[1]: libpod-e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9.scope: Deactivated successfully.
Jan 30 04:28:48 np0005601977 podman[215517]: 2026-01-30 09:28:48.92730871 +0000 UTC m=+0.046527591 container died e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:28:48 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9-userdata-shm.mount: Deactivated successfully.
Jan 30 04:28:48 np0005601977 systemd[1]: var-lib-containers-storage-overlay-6ddd7920348df022094c000a4b30a0dbb32035ab58f931faacbd2ea3573dd47e-merged.mount: Deactivated successfully.
Jan 30 04:28:48 np0005601977 podman[215517]: 2026-01-30 09:28:48.963756489 +0000 UTC m=+0.082975340 container cleanup e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:28:48 np0005601977 systemd[1]: libpod-conmon-e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9.scope: Deactivated successfully.
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.989 183134 INFO nova.virt.libvirt.driver [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance destroyed successfully.#033[00m
Jan 30 04:28:48 np0005601977 nova_compute[183130]: 2026-01-30 09:28:48.990 183134 DEBUG nova.objects.instance [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 37aaa571-2821-4d88-b360-9f7b02c1aa1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:28:49 np0005601977 podman[215552]: 2026-01-30 09:28:49.015105047 +0000 UTC m=+0.035195144 container remove e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.019 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[becbeb3d-1d03-4dc8-b40b-e7e6ad633518]: (4, ('Fri Jan 30 09:28:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 (e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9)\ne7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9\nFri Jan 30 09:28:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 (e7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9)\ne7da71ced617ef03dfd74c9b71323b45cfb72b0dbb983421f00983177bd670d9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.020 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c49f4dc0-9a0a-44cd-b5ab-d857646d2ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.021 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf368f4-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.022 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:49 np0005601977 kernel: tap9cf368f4-a0: left promiscuous mode
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.029 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.031 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4e944a2e-2e57-4581-8de1-f3311a0e9601]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.047 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[475120ab-3ae4-4813-913f-fe581630bd20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.048 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c79ce742-6d24-4b1b-9811-c25ecfe5a17c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.062 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe2c33b-2d55-49e6-8b57-5c02c4723eb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383411, 'reachable_time': 43634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215584, 'error': None, 'target': 'ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.063 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf368f4-a96c-4392-8db3-50f404160fc3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:28:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:49.064 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[8ccae23d-64d9-49e1-bf56-bbcd93fdc504]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:28:49 np0005601977 systemd[1]: run-netns-ovnmeta\x2d9cf368f4\x2da96c\x2d4392\x2d8db3\x2d50f404160fc3.mount: Deactivated successfully.
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.104 183134 DEBUG nova.compute.manager [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.105 183134 DEBUG nova.compute.manager [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing instance network info cache due to event network-changed-6a96c970-8213-4137-b6a7-4c31f1488ad5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.105 183134 DEBUG oslo_concurrency.lockutils [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.105 183134 DEBUG oslo_concurrency.lockutils [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.105 183134 DEBUG nova.network.neutron [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Refreshing network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.125 183134 DEBUG nova.virt.libvirt.vif [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:26:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-809581554',display_name='tempest-TestNetworkAdvancedServerOps-server-809581554',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-809581554',id=12,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFkWWUcp4/hru5LqJv27yDpf+1+iRsZOi/M8GbB/5I7iXHwxefzcmFLbcQt/GAvJQU8x8sEPj2RwtuV5cYtwmvilMTvuGdMtuc0URBoOB56Fvi6bvlKiWYrL++0Kht1i5g==',key_name='tempest-TestNetworkAdvancedServerOps-1685274608',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:28:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-hjlly9q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:28:24Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=37aaa571-2821-4d88-b360-9f7b02c1aa1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.125 183134 DEBUG nova.network.os_vif_util [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.125 183134 DEBUG nova.network.os_vif_util [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.126 183134 DEBUG os_vif [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.127 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.127 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a96c970-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.128 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.131 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.134 183134 INFO os_vif [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:f1,bridge_name='br-int',has_traffic_filtering=True,id=6a96c970-8213-4137-b6a7-4c31f1488ad5,network=Network(9cf368f4-a96c-4392-8db3-50f404160fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a96c970-82')
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.134 183134 INFO nova.virt.libvirt.driver [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Deleting instance files /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_del
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.137 183134 INFO nova.virt.libvirt.driver [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Deletion of /var/lib/nova/instances/37aaa571-2821-4d88-b360-9f7b02c1aa1b_del complete
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.205 183134 INFO nova.compute.manager [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Took 0.48 seconds to destroy the instance on the hypervisor.
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.206 183134 DEBUG oslo.service.loopingcall [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.206 183134 DEBUG nova.compute.manager [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.207 183134 DEBUG nova.network.neutron [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.450 183134 DEBUG nova.network.neutron [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.694 183134 INFO nova.compute.manager [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Took 2.51 seconds to deallocate network for instance.
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.798 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.799 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.903 183134 DEBUG nova.compute.provider_tree [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:28:49 np0005601977 nova_compute[183130]: 2026-01-30 09:28:49.931 183134 DEBUG nova.scheduler.client.report [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.002 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.029 183134 DEBUG nova.compute.manager [req-5b9bcaec-98f9-40d7-85df-4c93be7bf5ba req-0760dfa8-a6ab-4e9f-a9b3-41190a8661d3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-vif-deleted-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.041 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.144 183134 DEBUG nova.network.neutron [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updated VIF entry in instance network info cache for port 620ad6fc-218e-4453-8811-06bd2d12bb67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.144 183134 DEBUG nova.network.neutron [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Updating instance_info_cache with network_info: [{"id": "620ad6fc-218e-4453-8811-06bd2d12bb67", "address": "fa:16:3e:e3:61:d8", "network": {"id": "56316a99-761e-4baf-8191-d82570ee0e52", "bridge": "br-int", "label": "tempest-network-smoke--572763736", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap620ad6fc-21", "ovs_interfaceid": "620ad6fc-218e-4453-8811-06bd2d12bb67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.171 183134 INFO nova.scheduler.client.report [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance a9a1475a-af89-477b-bc8b-31a79fa63f3e
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.198 183134 DEBUG oslo_concurrency.lockutils [req-9ac2a8a9-f78e-4df0-a948-1601ad7f7354 req-c9acc7a1-e23a-4591-b25f-2d18e3133392 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-a9a1475a-af89-477b-bc8b-31a79fa63f3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:28:50 np0005601977 nova_compute[183130]: 2026-01-30 09:28:50.267 183134 DEBUG oslo_concurrency.lockutils [None req-f902c65e-76ec-4442-82d2-479a50cd91fa 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.351 183134 DEBUG nova.compute.manager [req-2cd30c12-bb72-4d00-a153-07eb879da6ae req-19c0484b-6ebd-4eef-9cf8-e067713eb377 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.352 183134 DEBUG oslo_concurrency.lockutils [req-2cd30c12-bb72-4d00-a153-07eb879da6ae req-19c0484b-6ebd-4eef-9cf8-e067713eb377 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.352 183134 DEBUG oslo_concurrency.lockutils [req-2cd30c12-bb72-4d00-a153-07eb879da6ae req-19c0484b-6ebd-4eef-9cf8-e067713eb377 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.352 183134 DEBUG oslo_concurrency.lockutils [req-2cd30c12-bb72-4d00-a153-07eb879da6ae req-19c0484b-6ebd-4eef-9cf8-e067713eb377 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.353 183134 DEBUG nova.compute.manager [req-2cd30c12-bb72-4d00-a153-07eb879da6ae req-19c0484b-6ebd-4eef-9cf8-e067713eb377 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] No waiting events found dispatching network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.353 183134 WARNING nova.compute.manager [req-2cd30c12-bb72-4d00-a153-07eb879da6ae req-19c0484b-6ebd-4eef-9cf8-e067713eb377 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received unexpected event network-vif-plugged-620ad6fc-218e-4453-8811-06bd2d12bb67 for instance with vm_state deleted and task_state None.
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.609 183134 DEBUG nova.network.neutron [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.693 183134 INFO nova.compute.manager [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Took 3.49 seconds to deallocate network for instance.
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.749 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.750 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.758 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.795 183134 INFO nova.scheduler.client.report [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocations for instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.812 183134 DEBUG nova.network.neutron [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updated VIF entry in instance network info cache for port 6a96c970-8213-4137-b6a7-4c31f1488ad5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.813 183134 DEBUG nova.network.neutron [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Updating instance_info_cache with network_info: [{"id": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "address": "fa:16:3e:54:7d:f1", "network": {"id": "9cf368f4-a96c-4392-8db3-50f404160fc3", "bridge": "br-int", "label": "tempest-network-smoke--456235833", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a96c970-82", "ovs_interfaceid": "6a96c970-8213-4137-b6a7-4c31f1488ad5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.850 183134 DEBUG oslo_concurrency.lockutils [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-37aaa571-2821-4d88-b360-9f7b02c1aa1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.851 183134 DEBUG nova.compute.manager [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-vif-unplugged-620ad6fc-218e-4453-8811-06bd2d12bb67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.851 183134 DEBUG oslo_concurrency.lockutils [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.851 183134 DEBUG oslo_concurrency.lockutils [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.852 183134 DEBUG oslo_concurrency.lockutils [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "a9a1475a-af89-477b-bc8b-31a79fa63f3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.852 183134 DEBUG nova.compute.manager [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] No waiting events found dispatching network-vif-unplugged-620ad6fc-218e-4453-8811-06bd2d12bb67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.852 183134 DEBUG nova.compute.manager [req-8a14243a-4df2-4353-bad3-912054e0b0cb req-942e6e2f-dbaa-4b2a-b481-9e3c4df1f030 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Received event network-vif-unplugged-620ad6fc-218e-4453-8811-06bd2d12bb67 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.870 183134 DEBUG oslo_concurrency.lockutils [None req-a6f74acb-18bb-496d-9088-88a9ce8726c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.878 183134 DEBUG nova.compute.manager [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.879 183134 DEBUG oslo_concurrency.lockutils [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.879 183134 DEBUG oslo_concurrency.lockutils [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.879 183134 DEBUG oslo_concurrency.lockutils [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.879 183134 DEBUG nova.compute.manager [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.879 183134 WARNING nova.compute.manager [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-unplugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state deleted and task_state None.
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.880 183134 DEBUG nova.compute.manager [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.880 183134 DEBUG oslo_concurrency.lockutils [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.880 183134 DEBUG oslo_concurrency.lockutils [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.880 183134 DEBUG oslo_concurrency.lockutils [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "37aaa571-2821-4d88-b360-9f7b02c1aa1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.880 183134 DEBUG nova.compute.manager [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] No waiting events found dispatching network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:28:52 np0005601977 nova_compute[183130]: 2026-01-30 09:28:52.881 183134 WARNING nova.compute.manager [req-8639c281-4837-4ece-82a4-199202c4b523 req-44cd0152-dd05-4c96-ac08-9ff54b0f6824 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received unexpected event network-vif-plugged-6a96c970-8213-4137-b6a7-4c31f1488ad5 for instance with vm_state deleted and task_state None.
Jan 30 04:28:53 np0005601977 podman[215585]: 2026-01-30 09:28:53.893846872 +0000 UTC m=+0.079251873 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 30 04:28:54 np0005601977 nova_compute[183130]: 2026-01-30 09:28:54.128 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:54 np0005601977 nova_compute[183130]: 2026-01-30 09:28:54.598 183134 DEBUG nova.compute.manager [req-bb5d1c23-b213-424d-b34b-6282a7f641a5 req-8353c941-05f3-49a8-af5a-2e05687ba832 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Received event network-vif-deleted-6a96c970-8213-4137-b6a7-4c31f1488ad5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:28:54 np0005601977 nova_compute[183130]: 2026-01-30 09:28:54.598 183134 INFO nova.compute.manager [req-bb5d1c23-b213-424d-b34b-6282a7f641a5 req-8353c941-05f3-49a8-af5a-2e05687ba832 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Neutron deleted interface 6a96c970-8213-4137-b6a7-4c31f1488ad5; detaching it from the instance and deleting it from the info cache
Jan 30 04:28:54 np0005601977 nova_compute[183130]: 2026-01-30 09:28:54.598 183134 DEBUG nova.network.neutron [req-bb5d1c23-b213-424d-b34b-6282a7f641a5 req-8353c941-05f3-49a8-af5a-2e05687ba832 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 30 04:28:54 np0005601977 nova_compute[183130]: 2026-01-30 09:28:54.600 183134 DEBUG nova.compute.manager [req-bb5d1c23-b213-424d-b34b-6282a7f641a5 req-8353c941-05f3-49a8-af5a-2e05687ba832 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Detach interface failed, port_id=6a96c970-8213-4137-b6a7-4c31f1488ad5, reason: Instance 37aaa571-2821-4d88-b360-9f7b02c1aa1b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 30 04:28:55 np0005601977 nova_compute[183130]: 2026-01-30 09:28:55.038 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:57Z|00136|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.159 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:28:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:28:57Z|00137|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.278 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.368 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.368 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:28:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:57.381 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:57.381 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:28:57.382 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.439 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.503 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.504 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.514 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.515 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.531 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.599 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.649 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.649 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.657 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.657 183134 INFO nova.compute.claims [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.781 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.782 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5448MB free_disk=73.3307876586914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.782 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.806 183134 DEBUG nova.compute.provider_tree [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.836 183134 DEBUG nova.scheduler.client.report [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.859 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.859 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.862 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.934 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.934 183134 DEBUG nova.network.neutron [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.953 183134 INFO nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.962 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.962 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.963 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.963 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:28:57 np0005601977 nova_compute[183130]: 2026-01-30 09:28:57.972 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.050 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.077 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.099 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.100 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.101 183134 INFO nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Creating image(s)#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.101 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.101 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.102 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.112 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.113 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.113 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.186 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.188 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.189 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.213 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.277 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.278 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.305 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.306 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.307 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.383 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.384 183134 DEBUG nova.virt.disk.api [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.385 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.448 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.449 183134 DEBUG nova.virt.disk.api [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.450 183134 DEBUG nova.objects.instance [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.453 183134 DEBUG nova.policy [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.474 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.475 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Ensure instance console log exists: /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.475 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.476 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:28:58 np0005601977 nova_compute[183130]: 2026-01-30 09:28:58.476 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:28:59 np0005601977 nova_compute[183130]: 2026-01-30 09:28:59.130 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:00 np0005601977 nova_compute[183130]: 2026-01-30 09:29:00.012 183134 DEBUG nova.network.neutron [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Successfully created port: 2011cfc4-3053-450f-9a91-99928686bc26 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:29:00 np0005601977 nova_compute[183130]: 2026-01-30 09:29:00.057 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:01 np0005601977 nova_compute[183130]: 2026-01-30 09:29:01.132 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765326.1314335, a9a1475a-af89-477b-bc8b-31a79fa63f3e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:01 np0005601977 nova_compute[183130]: 2026-01-30 09:29:01.133 183134 INFO nova.compute.manager [-] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:29:01 np0005601977 nova_compute[183130]: 2026-01-30 09:29:01.202 183134 DEBUG nova.compute.manager [None req-d7801f86-c5d1-4f65-8ad3-84751a9b6ac3 - - - - - -] [instance: a9a1475a-af89-477b-bc8b-31a79fa63f3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:02 np0005601977 podman[215634]: 2026-01-30 09:29:02.829292078 +0000 UTC m=+0.049740003 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.400 183134 DEBUG nova.network.neutron [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Successfully updated port: 2011cfc4-3053-450f-9a91-99928686bc26 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.438 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.439 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.439 183134 DEBUG nova.network.neutron [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.685 183134 DEBUG nova.network.neutron [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.987 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765328.9867845, 37aaa571-2821-4d88-b360-9f7b02c1aa1b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:03 np0005601977 nova_compute[183130]: 2026-01-30 09:29:03.988 183134 INFO nova.compute.manager [-] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.020 183134 DEBUG nova.compute.manager [None req-ff641da3-9ac4-438c-9aca-4e459526c63f - - - - - -] [instance: 37aaa571-2821-4d88-b360-9f7b02c1aa1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.109 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.133 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.140 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.141 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.439 183134 DEBUG nova.compute.manager [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-changed-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.439 183134 DEBUG nova.compute.manager [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing instance network info cache due to event network-changed-2011cfc4-3053-450f-9a91-99928686bc26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:29:04 np0005601977 nova_compute[183130]: 2026-01-30 09:29:04.440 183134 DEBUG oslo_concurrency.lockutils [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.103 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.395 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.615 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.615 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.616 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.616 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7a073e24-c800-4962-af5e-ff5400800f34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.701 183134 DEBUG nova.network.neutron [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.743 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.743 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Instance network_info: |[{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.744 183134 DEBUG oslo_concurrency.lockutils [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.744 183134 DEBUG nova.network.neutron [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing network info cache for port 2011cfc4-3053-450f-9a91-99928686bc26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.749 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Start _get_guest_xml network_info=[{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.756 183134 WARNING nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.762 183134 DEBUG nova.virt.libvirt.host [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.762 183134 DEBUG nova.virt.libvirt.host [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.771 183134 DEBUG nova.virt.libvirt.host [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.772 183134 DEBUG nova.virt.libvirt.host [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.773 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.774 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.775 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.775 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.775 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.776 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.776 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.777 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.777 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.778 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.778 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.779 183134 DEBUG nova.virt.hardware [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.784 183134 DEBUG nova.virt.libvirt.vif [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:28:58Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.785 183134 DEBUG nova.network.os_vif_util [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.786 183134 DEBUG nova.network.os_vif_util [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.787 183134 DEBUG nova.objects.instance [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.820 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <uuid>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</uuid>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <name>instance-00000013</name>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:29:05</nova:creationTime>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <entry name="serial">65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <entry name="uuid">65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:57:c6:12"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <target dev="tap2011cfc4-30"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log" append="off"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:29:05 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:05 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:05 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:05 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.821 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Preparing to wait for external event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.822 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.822 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.823 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.823 183134 DEBUG nova.virt.libvirt.vif [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:28:58Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.824 183134 DEBUG nova.network.os_vif_util [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.825 183134 DEBUG nova.network.os_vif_util [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.825 183134 DEBUG os_vif [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.826 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.826 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.827 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.830 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.831 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2011cfc4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.831 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2011cfc4-30, col_values=(('external_ids', {'iface-id': '2011cfc4-3053-450f-9a91-99928686bc26', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:c6:12', 'vm-uuid': '65e07f9f-264b-4e0d-9aa7-f87ebaf84705'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.833 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:05 np0005601977 NetworkManager[55565]: <info>  [1769765345.8347] manager: (tap2011cfc4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.836 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.841 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.842 183134 INFO os_vif [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30')#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.910 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.911 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.911 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:57:c6:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:29:05 np0005601977 nova_compute[183130]: 2026-01-30 09:29:05.912 183134 INFO nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Using config drive#033[00m
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.393 183134 INFO nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Creating config drive at /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config#033[00m
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.400 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpay7afpgi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.522 183134 DEBUG oslo_concurrency.processutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpay7afpgi" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:06 np0005601977 kernel: tap2011cfc4-30: entered promiscuous mode
Jan 30 04:29:06 np0005601977 NetworkManager[55565]: <info>  [1769765346.5694] manager: (tap2011cfc4-30): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Jan 30 04:29:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:06Z|00138|binding|INFO|Claiming lport 2011cfc4-3053-450f-9a91-99928686bc26 for this chassis.
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.570 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:06Z|00139|binding|INFO|2011cfc4-3053-450f-9a91-99928686bc26: Claiming fa:16:3e:57:c6:12 10.100.0.11
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.574 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 systemd-udevd[215677]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.598 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.600 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:c6:12 10.100.0.11'], port_security=['fa:16:3e:57:c6:12 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '65e07f9f-264b-4e0d-9aa7-f87ebaf84705', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34535701-9131-4137-9b04-abc5c4bde788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f7d566d0-1254-4641-872b-bbe4cfbb0f9f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbacebc3-4b49-42c7-a1e2-a9ffa8ce4adb, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=2011cfc4-3053-450f-9a91-99928686bc26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.601 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 2011cfc4-3053-450f-9a91-99928686bc26 in datapath 34535701-9131-4137-9b04-abc5c4bde788 bound to our chassis#033[00m
Jan 30 04:29:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:06Z|00140|binding|INFO|Setting lport 2011cfc4-3053-450f-9a91-99928686bc26 ovn-installed in OVS
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.603 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34535701-9131-4137-9b04-abc5c4bde788#033[00m
Jan 30 04:29:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:06Z|00141|binding|INFO|Setting lport 2011cfc4-3053-450f-9a91-99928686bc26 up in Southbound
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.604 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 systemd-machined[154431]: New machine qemu-11-instance-00000013.
Jan 30 04:29:06 np0005601977 NetworkManager[55565]: <info>  [1769765346.6110] device (tap2011cfc4-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:29:06 np0005601977 NetworkManager[55565]: <info>  [1769765346.6116] device (tap2011cfc4-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.611 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2528b761-743a-49e1-9c0c-277532ad20f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.612 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap34535701-91 in ovnmeta-34535701-9131-4137-9b04-abc5c4bde788 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.615 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap34535701-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.615 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[75ba4438-be7c-49bc-93ea-251772bea059]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.616 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[41a08e24-8cc0-46ae-a62c-8d256b765a4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 systemd[1]: Started Virtual Machine qemu-11-instance-00000013.
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.629 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[01e3e028-1dae-4b6d-80ef-42bed3e2d9f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.641 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bb454764-8a55-4bde-8f1b-c95102a19944]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.665 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ef5ff3-b8c2-4100-a030-a0229794cfe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.671 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7afcf379-0114-4c8b-93b7-05de7ef1e4c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 systemd-udevd[215681]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:06 np0005601977 NetworkManager[55565]: <info>  [1769765346.6725] manager: (tap34535701-90): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.697 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2d088c6e-d645-4fdf-a4c2-b87a2668a53f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.700 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[58224e0f-feda-4e50-ba86-310915da0779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 NetworkManager[55565]: <info>  [1769765346.7191] device (tap34535701-90): carrier: link connected
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.724 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9eee9e67-9bfa-45d4-bca8-9dfd934a25fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.738 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1160ee8d-83bd-478e-8d04-ac320091e757]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34535701-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:ce:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387805, 'reachable_time': 43670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215711, 'error': None, 'target': 'ovnmeta-34535701-9131-4137-9b04-abc5c4bde788', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.750 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8a221a79-84f1-4cda-bbdf-4380a2adc792]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:ce74'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387805, 'tstamp': 387805}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215712, 'error': None, 'target': 'ovnmeta-34535701-9131-4137-9b04-abc5c4bde788', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.765 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bcea441e-4a48-45cd-94ef-df97af88f90c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34535701-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:ce:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387805, 'reachable_time': 43670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215713, 'error': None, 'target': 'ovnmeta-34535701-9131-4137-9b04-abc5c4bde788', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.792 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[428f4e53-0529-4220-8cf4-77f09495d650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.858 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[35c6c34a-07ec-45c8-b174-059f6fbda863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.859 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34535701-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.860 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.860 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34535701-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:06 np0005601977 kernel: tap34535701-90: entered promiscuous mode
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.862 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 NetworkManager[55565]: <info>  [1769765346.8640] manager: (tap34535701-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.863 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.864 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34535701-90, col_values=(('external_ids', {'iface-id': 'd9b977ba-c89f-4674-bf61-3e41f1337fa6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:06Z|00142|binding|INFO|Releasing lport d9b977ba-c89f-4674-bf61-3e41f1337fa6 from this chassis (sb_readonly=0)
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.865 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.867 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.867 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/34535701-9131-4137-9b04-abc5c4bde788.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/34535701-9131-4137-9b04-abc5c4bde788.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.868 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[893a2bc3-8bcc-46fa-b215-b12ce9375be0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.869 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-34535701-9131-4137-9b04-abc5c4bde788
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/34535701-9131-4137-9b04-abc5c4bde788.pid.haproxy
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 34535701-9131-4137-9b04-abc5c4bde788
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:29:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:06.869 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-34535701-9131-4137-9b04-abc5c4bde788', 'env', 'PROCESS_TAG=haproxy-34535701-9131-4137-9b04-abc5c4bde788', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/34535701-9131-4137-9b04-abc5c4bde788.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:29:06 np0005601977 nova_compute[183130]: 2026-01-30 09:29:06.871 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.176 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765347.1753123, 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.177 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] VM Started (Lifecycle Event)#033[00m
Jan 30 04:29:07 np0005601977 podman[215752]: 2026-01-30 09:29:07.213704664 +0000 UTC m=+0.053387838 container create c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.219 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.225 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765347.176119, 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.225 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:29:07 np0005601977 systemd[1]: Started libpod-conmon-c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5.scope.
Jan 30 04:29:07 np0005601977 podman[215752]: 2026-01-30 09:29:07.183488414 +0000 UTC m=+0.023171638 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:29:07 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.289 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:07 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd4da329e6c7463504556dda0aa5cf88dab6757f711f19e721760d8db4d57b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.296 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:29:07 np0005601977 podman[215752]: 2026-01-30 09:29:07.306156105 +0000 UTC m=+0.145839269 container init c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 30 04:29:07 np0005601977 podman[215752]: 2026-01-30 09:29:07.311947492 +0000 UTC m=+0.151630636 container start c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:29:07 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [NOTICE]   (215771) : New worker (215773) forked
Jan 30 04:29:07 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [NOTICE]   (215771) : Loading success.
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.377 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.399 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updating instance_info_cache with network_info: [{"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.433 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.433 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.433 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.434 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.435 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:07 np0005601977 nova_compute[183130]: 2026-01-30 09:29:07.435 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.071 183134 DEBUG nova.network.neutron [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updated VIF entry in instance network info cache for port 2011cfc4-3053-450f-9a91-99928686bc26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.071 183134 DEBUG nova.network.neutron [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.226 183134 DEBUG oslo_concurrency.lockutils [req-8fff8906-6345-450a-87b0-0cb77712e863 req-61a6cbd0-aeff-4d15-94f6-916be3d2f026 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.720 183134 DEBUG nova.compute.manager [req-2a5943ae-dc99-494d-9fea-75df92c33715 req-0df7e301-713d-4de4-b12d-336420d6974d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.721 183134 DEBUG oslo_concurrency.lockutils [req-2a5943ae-dc99-494d-9fea-75df92c33715 req-0df7e301-713d-4de4-b12d-336420d6974d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.722 183134 DEBUG oslo_concurrency.lockutils [req-2a5943ae-dc99-494d-9fea-75df92c33715 req-0df7e301-713d-4de4-b12d-336420d6974d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.723 183134 DEBUG oslo_concurrency.lockutils [req-2a5943ae-dc99-494d-9fea-75df92c33715 req-0df7e301-713d-4de4-b12d-336420d6974d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.723 183134 DEBUG nova.compute.manager [req-2a5943ae-dc99-494d-9fea-75df92c33715 req-0df7e301-713d-4de4-b12d-336420d6974d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Processing event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.724 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.729 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765348.729113, 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.730 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] VM Resumed (Lifecycle Event)
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.732 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.737 183134 INFO nova.virt.libvirt.driver [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Instance spawned successfully.
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.737 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.829 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.835 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.835 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.836 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.837 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.837 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.838 183134 DEBUG nova.virt.libvirt.driver [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.845 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:29:08 np0005601977 nova_compute[183130]: 2026-01-30 09:29:08.916 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:29:09 np0005601977 nova_compute[183130]: 2026-01-30 09:29:09.158 183134 INFO nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Took 11.06 seconds to spawn the instance on the hypervisor.
Jan 30 04:29:09 np0005601977 nova_compute[183130]: 2026-01-30 09:29:09.158 183134 DEBUG nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:29:09 np0005601977 nova_compute[183130]: 2026-01-30 09:29:09.251 183134 INFO nova.compute.manager [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Took 11.64 seconds to build instance.
Jan 30 04:29:09 np0005601977 nova_compute[183130]: 2026-01-30 09:29:09.436 183134 DEBUG oslo_concurrency.lockutils [None req-9cac2ea7-f900-44bc-ba71-0ebeb1f9af01 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:10 np0005601977 nova_compute[183130]: 2026-01-30 09:29:10.145 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:10 np0005601977 nova_compute[183130]: 2026-01-30 09:29:10.430 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:10 np0005601977 nova_compute[183130]: 2026-01-30 09:29:10.834 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:11 np0005601977 nova_compute[183130]: 2026-01-30 09:29:11.064 183134 DEBUG nova.compute.manager [req-a08e7348-8bac-49f1-8f72-b80a78503f75 req-f11e71b9-d8b4-4768-a3ef-88051e974ff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:29:11 np0005601977 nova_compute[183130]: 2026-01-30 09:29:11.065 183134 DEBUG oslo_concurrency.lockutils [req-a08e7348-8bac-49f1-8f72-b80a78503f75 req-f11e71b9-d8b4-4768-a3ef-88051e974ff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:11 np0005601977 nova_compute[183130]: 2026-01-30 09:29:11.065 183134 DEBUG oslo_concurrency.lockutils [req-a08e7348-8bac-49f1-8f72-b80a78503f75 req-f11e71b9-d8b4-4768-a3ef-88051e974ff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:11 np0005601977 nova_compute[183130]: 2026-01-30 09:29:11.066 183134 DEBUG oslo_concurrency.lockutils [req-a08e7348-8bac-49f1-8f72-b80a78503f75 req-f11e71b9-d8b4-4768-a3ef-88051e974ff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:11 np0005601977 nova_compute[183130]: 2026-01-30 09:29:11.066 183134 DEBUG nova.compute.manager [req-a08e7348-8bac-49f1-8f72-b80a78503f75 req-f11e71b9-d8b4-4768-a3ef-88051e974ff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:29:11 np0005601977 nova_compute[183130]: 2026-01-30 09:29:11.066 183134 WARNING nova.compute.manager [req-a08e7348-8bac-49f1-8f72-b80a78503f75 req-f11e71b9-d8b4-4768-a3ef-88051e974ff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received unexpected event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 for instance with vm_state active and task_state None.
Jan 30 04:29:11 np0005601977 podman[215783]: 2026-01-30 09:29:11.837942553 +0000 UTC m=+0.055291843 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:29:11 np0005601977 podman[215782]: 2026-01-30 09:29:11.855607861 +0000 UTC m=+0.076042970 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z)
Jan 30 04:29:13 np0005601977 NetworkManager[55565]: <info>  [1769765353.3628] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.362 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:13 np0005601977 NetworkManager[55565]: <info>  [1769765353.3639] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.417 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:13 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:13Z|00143|binding|INFO|Releasing lport d9b977ba-c89f-4674-bf61-3e41f1337fa6 from this chassis (sb_readonly=0)
Jan 30 04:29:13 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:13Z|00144|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.452 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.983 183134 DEBUG nova.compute.manager [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-changed-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.983 183134 DEBUG nova.compute.manager [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing instance network info cache due to event network-changed-2011cfc4-3053-450f-9a91-99928686bc26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.984 183134 DEBUG oslo_concurrency.lockutils [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.984 183134 DEBUG oslo_concurrency.lockutils [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:29:13 np0005601977 nova_compute[183130]: 2026-01-30 09:29:13.984 183134 DEBUG nova.network.neutron [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing network info cache for port 2011cfc4-3053-450f-9a91-99928686bc26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:29:15 np0005601977 nova_compute[183130]: 2026-01-30 09:29:15.148 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:15 np0005601977 nova_compute[183130]: 2026-01-30 09:29:15.836 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:15 np0005601977 nova_compute[183130]: 2026-01-30 09:29:15.890 183134 DEBUG nova.network.neutron [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updated VIF entry in instance network info cache for port 2011cfc4-3053-450f-9a91-99928686bc26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:29:15 np0005601977 nova_compute[183130]: 2026-01-30 09:29:15.891 183134 DEBUG nova.network.neutron [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:29:15 np0005601977 nova_compute[183130]: 2026-01-30 09:29:15.921 183134 DEBUG oslo_concurrency.lockutils [req-491a4992-987a-4b01-9817-21e9c862593e req-9a2c8d5d-5d0d-4141-b37e-0c1230bb12af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.393 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.737 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.738 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.771 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.864 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.865 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.875 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 30 04:29:18 np0005601977 nova_compute[183130]: 2026-01-30 09:29:18.875 183134 INFO nova.compute.claims [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Claim successful on node compute-0.ctlplane.example.com
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.170 183134 DEBUG nova.compute.provider_tree [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.186 183134 DEBUG nova.scheduler.client.report [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.211 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.212 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.293 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.294 183134 DEBUG nova.network.neutron [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.325 183134 INFO nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.360 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.493 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.496 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.497 183134 INFO nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Creating image(s)
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.498 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.498 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.499 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.527 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.609 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.611 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.612 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.629 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.687 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.689 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.713 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.714 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.715 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.764 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.765 183134 DEBUG nova.virt.disk.api [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.766 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.823 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.824 183134 DEBUG nova.virt.disk.api [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:29:19 np0005601977 podman[215853]: 2026-01-30 09:29:19.825110689 +0000 UTC m=+0.047212100 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.825 183134 DEBUG nova.objects.instance [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid 93629e5c-ca92-47ac-8567-35d85b4e2a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:19 np0005601977 podman[215850]: 2026-01-30 09:29:19.827637392 +0000 UTC m=+0.050215857 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.857 183134 DEBUG nova.policy [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.864 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.864 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Ensure instance console log exists: /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.865 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.865 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:19 np0005601977 nova_compute[183130]: 2026-01-30 09:29:19.866 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:20 np0005601977 nova_compute[183130]: 2026-01-30 09:29:20.183 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:20Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:c6:12 10.100.0.11
Jan 30 04:29:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:20Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:c6:12 10.100.0.11
Jan 30 04:29:20 np0005601977 nova_compute[183130]: 2026-01-30 09:29:20.838 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:21 np0005601977 nova_compute[183130]: 2026-01-30 09:29:21.510 183134 DEBUG nova.network.neutron [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Successfully created port: 695209cb-0de3-443c-9e7f-c65894975f23 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.871 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.872 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.893 183134 DEBUG nova.network.neutron [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Successfully updated port: 695209cb-0de3-443c-9e7f-c65894975f23 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.894 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.926 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.926 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.927 183134 DEBUG nova.network.neutron [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.993 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:22 np0005601977 nova_compute[183130]: 2026-01-30 09:29:22.994 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.002 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.002 183134 INFO nova.compute.claims [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.031 183134 DEBUG nova.compute.manager [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-changed-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.031 183134 DEBUG nova.compute.manager [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Refreshing instance network info cache due to event network-changed-695209cb-0de3-443c-9e7f-c65894975f23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.032 183134 DEBUG oslo_concurrency.lockutils [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.198 183134 DEBUG nova.network.neutron [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.227 183134 DEBUG nova.compute.provider_tree [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.246 183134 DEBUG nova.scheduler.client.report [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.282 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.282 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.335 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.336 183134 DEBUG nova.network.neutron [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.362 183134 INFO nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.381 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.521 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.522 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.523 183134 INFO nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Creating image(s)#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.523 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.525 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.525 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.537 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.608 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.609 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.609 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.618 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.670 183134 DEBUG nova.policy [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.697 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.699 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.729 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.730 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.731 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.789 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.790 183134 DEBUG nova.virt.disk.api [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.791 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.866 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.898 183134 DEBUG nova.virt.disk.api [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.899 183134 DEBUG nova.objects.instance [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 362db3ae-3984-411e-994b-55924dc0c06f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.928 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.929 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Ensure instance console log exists: /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.929 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.930 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:23 np0005601977 nova_compute[183130]: 2026-01-30 09:29:23.930 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.532 183134 DEBUG nova.network.neutron [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.566 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.567 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Instance network_info: |[{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.569 183134 DEBUG oslo_concurrency.lockutils [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.569 183134 DEBUG nova.network.neutron [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Refreshing network info cache for port 695209cb-0de3-443c-9e7f-c65894975f23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.576 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Start _get_guest_xml network_info=[{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.582 183134 WARNING nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.586 183134 DEBUG nova.virt.libvirt.host [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.587 183134 DEBUG nova.virt.libvirt.host [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.591 183134 DEBUG nova.virt.libvirt.host [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.591 183134 DEBUG nova.virt.libvirt.host [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.592 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.592 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.593 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.593 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.593 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.593 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.594 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.594 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.594 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.594 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.594 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.595 183134 DEBUG nova.virt.hardware [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.598 183134 DEBUG nova.virt.libvirt.vif [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:29:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=20,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyOupxEQP5rPhxv3Ovs0buVpKo9DK1SFWIgHh1g4GNOSB04wmj6A6QDKnx5FDTCMUBmlFKzh8u77bIg75/X+JZ/jpIK2VxEM7v20lB4s0EWjtZAb/cScGOoEldqGiJNmQ==',key_name='tempest-TestSecurityGroupsBasicOps-187707995',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-6j7tnxpj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:29:19Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=93629e5c-ca92-47ac-8567-35d85b4e2a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.598 183134 DEBUG nova.network.os_vif_util [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.599 183134 DEBUG nova.network.os_vif_util [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.599 183134 DEBUG nova.objects.instance [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid 93629e5c-ca92-47ac-8567-35d85b4e2a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.616 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <uuid>93629e5c-ca92-47ac-8567-35d85b4e2a73</uuid>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <name>instance-00000014</name>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569</nova:name>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:29:24</nova:creationTime>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        <nova:port uuid="695209cb-0de3-443c-9e7f-c65894975f23">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <entry name="serial">93629e5c-ca92-47ac-8567-35d85b4e2a73</entry>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <entry name="uuid">93629e5c-ca92-47ac-8567-35d85b4e2a73</entry>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.config"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:33:ea:ed"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <target dev="tap695209cb-0d"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/console.log" append="off"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:29:24 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:24 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:24 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:24 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.617 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Preparing to wait for external event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.618 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.618 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.618 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.619 183134 DEBUG nova.virt.libvirt.vif [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:29:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=20,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyOupxEQP5rPhxv3Ovs0buVpKo9DK1SFWIgHh1g4GNOSB04wmj6A6QDKnx5FDTCMUBmlFKzh8u77bIg75/X+JZ/jpIK2VxEM7v20lB4s0EWjtZAb/cScGOoEldqGiJNmQ==',key_name='tempest-TestSecurityGroupsBasicOps-187707995',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-6j7tnxpj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:29:19Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=93629e5c-ca92-47ac-8567-35d85b4e2a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.619 183134 DEBUG nova.network.os_vif_util [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.620 183134 DEBUG nova.network.os_vif_util [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.621 183134 DEBUG os_vif [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.622 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.622 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.625 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.625 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap695209cb-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.625 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap695209cb-0d, col_values=(('external_ids', {'iface-id': '695209cb-0de3-443c-9e7f-c65894975f23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:ea:ed', 'vm-uuid': '93629e5c-ca92-47ac-8567-35d85b4e2a73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:24 np0005601977 NetworkManager[55565]: <info>  [1769765364.6279] manager: (tap695209cb-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.628 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.633 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.635 183134 INFO os_vif [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d')#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.654 183134 DEBUG nova.network.neutron [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Successfully created port: 4433db17-a607-4a44-9251-c5e602dc0576 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.695 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.695 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.695 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:33:ea:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:29:24 np0005601977 nova_compute[183130]: 2026-01-30 09:29:24.696 183134 INFO nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Using config drive#033[00m
Jan 30 04:29:24 np0005601977 podman[215915]: 2026-01-30 09:29:24.726721211 +0000 UTC m=+0.066067549 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.213 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.321 183134 INFO nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Creating config drive at /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.config#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.325 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdwpyka4r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.362 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Creating tmpfile /var/lib/nova/instances/tmp6r9pss4b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.363 183134 DEBUG nova.compute.manager [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6r9pss4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.440 183134 DEBUG oslo_concurrency.processutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdwpyka4r" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:25 np0005601977 kernel: tap695209cb-0d: entered promiscuous mode
Jan 30 04:29:25 np0005601977 NetworkManager[55565]: <info>  [1769765365.4738] manager: (tap695209cb-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.475 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:25Z|00145|binding|INFO|Claiming lport 695209cb-0de3-443c-9e7f-c65894975f23 for this chassis.
Jan 30 04:29:25 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:25Z|00146|binding|INFO|695209cb-0de3-443c-9e7f-c65894975f23: Claiming fa:16:3e:33:ea:ed 10.100.0.8
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.481 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:25Z|00147|binding|INFO|Setting lport 695209cb-0de3-443c-9e7f-c65894975f23 ovn-installed in OVS
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.483 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:25Z|00148|binding|INFO|Setting lport 695209cb-0de3-443c-9e7f-c65894975f23 up in Southbound
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.484 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.487 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:ea:ed 10.100.0.8'], port_security=['fa:16:3e:33:ea:ed 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a866993-35dd-4fa6-b18e-da0d2901678a 6a1909f5-bead-4d28-9b18-810f48b11797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3cddb3-a489-4457-a955-237f0d7cc907, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=695209cb-0de3-443c-9e7f-c65894975f23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.489 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 695209cb-0de3-443c-9e7f-c65894975f23 in datapath baf5a6be-5cb0-4dff-8451-d79eaebce0be bound to our chassis#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.492 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network baf5a6be-5cb0-4dff-8451-d79eaebce0be#033[00m
Jan 30 04:29:25 np0005601977 systemd-udevd[215958]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.501 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[82730d5a-4318-43bf-ab94-838bc21337e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 systemd-machined[154431]: New machine qemu-12-instance-00000014.
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.502 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbaf5a6be-51 in ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:29:25 np0005601977 NetworkManager[55565]: <info>  [1769765365.5038] device (tap695209cb-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:29:25 np0005601977 NetworkManager[55565]: <info>  [1769765365.5042] device (tap695209cb-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.505 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbaf5a6be-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.505 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3201d4f9-3263-4c4e-9b6a-535dc78ee4b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.506 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d3feee34-8598-4f42-8ed7-76c5baae8587]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 systemd[1]: Started Virtual Machine qemu-12-instance-00000014.
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.512 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[84371467-7279-4d10-943d-ddc4a75213db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.532 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[824c375c-7699-4d89-9e6b-2a4dae0ce94d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.548 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc77be4-8530-45ed-a35c-878b87ab3608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 NetworkManager[55565]: <info>  [1769765365.5538] manager: (tapbaf5a6be-50): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.554 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[acc9a428-9709-4132-9d84-19911c199abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.573 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[51bdb7a8-2b65-46ee-86e6-f3870246ce67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.575 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5481f9cd-45d4-4295-8976-96e75b5c7925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 NetworkManager[55565]: <info>  [1769765365.5894] device (tapbaf5a6be-50): carrier: link connected
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.593 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a738ddc6-67c1-4faf-afa6-7a543a83064d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.604 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[06d99661-ee03-4182-85b7-fd51d8b287f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaf5a6be-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389692, 'reachable_time': 36864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215992, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.616 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4d50e0d7-2958-48d4-b3fd-643b6e593495]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:7f9c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389692, 'tstamp': 389692}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215993, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.624 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff8cb80-8c49-435b-9b83-404bad95e1f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaf5a6be-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389692, 'reachable_time': 36864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215994, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.646 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[88152462-4a63-4780-ae05-ac338b85de9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.687 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f6638f04-088d-4d6b-9f01-fad9c6206592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.688 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaf5a6be-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.689 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.689 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaf5a6be-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:25 np0005601977 kernel: tapbaf5a6be-50: entered promiscuous mode
Jan 30 04:29:25 np0005601977 NetworkManager[55565]: <info>  [1769765365.6917] manager: (tapbaf5a6be-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.693 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.696 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbaf5a6be-50, col_values=(('external_ids', {'iface-id': '663ef153-23ef-4ecf-ab76-b6916e4933b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:25 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:25Z|00149|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.698 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/baf5a6be-5cb0-4dff-8451-d79eaebce0be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/baf5a6be-5cb0-4dff-8451-d79eaebce0be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.701 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.701 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e65393de-787b-4cd2-9f2f-f086f53ee2cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.702 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-baf5a6be-5cb0-4dff-8451-d79eaebce0be
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/baf5a6be-5cb0-4dff-8451-d79eaebce0be.pid.haproxy
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID baf5a6be-5cb0-4dff-8451-d79eaebce0be
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:29:25 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:25.702 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'env', 'PROCESS_TAG=haproxy-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/baf5a6be-5cb0-4dff-8451-d79eaebce0be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.975 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765365.974901, 93629e5c-ca92-47ac-8567-35d85b4e2a73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:25 np0005601977 nova_compute[183130]: 2026-01-30 09:29:25.975 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] VM Started (Lifecycle Event)#033[00m
Jan 30 04:29:26 np0005601977 podman[216033]: 2026-01-30 09:29:26.068383362 +0000 UTC m=+0.054447797 container create 0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 30 04:29:26 np0005601977 systemd[1]: Started libpod-conmon-0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48.scope.
Jan 30 04:29:26 np0005601977 podman[216033]: 2026-01-30 09:29:26.043004097 +0000 UTC m=+0.029068582 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:29:26 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:29:26 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78de20b395c1a1012a1f79dfd12ff1ff3d2ab27fb225c22e144a19b893a92898/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:29:26 np0005601977 podman[216033]: 2026-01-30 09:29:26.162018367 +0000 UTC m=+0.148082882 container init 0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:29:26 np0005601977 podman[216033]: 2026-01-30 09:29:26.169931963 +0000 UTC m=+0.155996428 container start 0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 30 04:29:26 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [NOTICE]   (216052) : New worker (216054) forked
Jan 30 04:29:26 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [NOTICE]   (216052) : Loading success.
Jan 30 04:29:26 np0005601977 nova_compute[183130]: 2026-01-30 09:29:26.227 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:26 np0005601977 nova_compute[183130]: 2026-01-30 09:29:26.233 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765365.9775324, 93629e5c-ca92-47ac-8567-35d85b4e2a73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:26 np0005601977 nova_compute[183130]: 2026-01-30 09:29:26.234 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:29:26 np0005601977 nova_compute[183130]: 2026-01-30 09:29:26.251 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:26 np0005601977 nova_compute[183130]: 2026-01-30 09:29:26.255 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:29:26 np0005601977 nova_compute[183130]: 2026-01-30 09:29:26.384 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.475 183134 DEBUG nova.compute.manager [req-cd753274-d4d4-4f77-a193-e3d7e7d15cfd req-82ac48ed-834c-4f63-b30f-8b25c830343c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.476 183134 DEBUG oslo_concurrency.lockutils [req-cd753274-d4d4-4f77-a193-e3d7e7d15cfd req-82ac48ed-834c-4f63-b30f-8b25c830343c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.477 183134 DEBUG oslo_concurrency.lockutils [req-cd753274-d4d4-4f77-a193-e3d7e7d15cfd req-82ac48ed-834c-4f63-b30f-8b25c830343c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.477 183134 DEBUG oslo_concurrency.lockutils [req-cd753274-d4d4-4f77-a193-e3d7e7d15cfd req-82ac48ed-834c-4f63-b30f-8b25c830343c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.478 183134 DEBUG nova.compute.manager [req-cd753274-d4d4-4f77-a193-e3d7e7d15cfd req-82ac48ed-834c-4f63-b30f-8b25c830343c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Processing event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.479 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.483 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765368.483357, 93629e5c-ca92-47ac-8567-35d85b4e2a73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.484 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] VM Resumed (Lifecycle Event)
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.487 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.490 183134 INFO nova.virt.libvirt.driver [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Instance spawned successfully.
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.491 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.505 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.512 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.515 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.516 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.517 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.518 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.518 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.519 183134 DEBUG nova.virt.libvirt.driver [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.534 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.603 183134 INFO nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Took 9.11 seconds to spawn the instance on the hypervisor.
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.604 183134 DEBUG nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.666 183134 INFO nova.compute.manager [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Took 9.84 seconds to build instance.
Jan 30 04:29:28 np0005601977 nova_compute[183130]: 2026-01-30 09:29:28.685 183134 DEBUG oslo_concurrency.lockutils [None req-cd711506-03c8-4874-b10b-a8618a9536b8 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:29 np0005601977 nova_compute[183130]: 2026-01-30 09:29:29.538 183134 DEBUG nova.network.neutron [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updated VIF entry in instance network info cache for port 695209cb-0de3-443c-9e7f-c65894975f23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:29:29 np0005601977 nova_compute[183130]: 2026-01-30 09:29:29.539 183134 DEBUG nova.network.neutron [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:29:29 np0005601977 nova_compute[183130]: 2026-01-30 09:29:29.562 183134 DEBUG oslo_concurrency.lockutils [req-06290f9c-5771-45e8-8074-cbe71f73cf52 req-74b2003d-024c-4146-8721-c09f0a6baff5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:29:29 np0005601977 nova_compute[183130]: 2026-01-30 09:29:29.671 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.095 183134 INFO nova.compute.manager [None req-eb447b5c-da31-4841-9562-9eeade64f93a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Get console output
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.100 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.121 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:30.121 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:29:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:30.122 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.215 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.635 183134 DEBUG nova.compute.manager [req-02efa210-c640-41ee-b9b8-73487e78fe2d req-117ee5a6-7395-4adf-817a-6914ec6202f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.635 183134 DEBUG oslo_concurrency.lockutils [req-02efa210-c640-41ee-b9b8-73487e78fe2d req-117ee5a6-7395-4adf-817a-6914ec6202f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.636 183134 DEBUG oslo_concurrency.lockutils [req-02efa210-c640-41ee-b9b8-73487e78fe2d req-117ee5a6-7395-4adf-817a-6914ec6202f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.636 183134 DEBUG oslo_concurrency.lockutils [req-02efa210-c640-41ee-b9b8-73487e78fe2d req-117ee5a6-7395-4adf-817a-6914ec6202f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.636 183134 DEBUG nova.compute.manager [req-02efa210-c640-41ee-b9b8-73487e78fe2d req-117ee5a6-7395-4adf-817a-6914ec6202f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] No waiting events found dispatching network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.636 183134 WARNING nova.compute.manager [req-02efa210-c640-41ee-b9b8-73487e78fe2d req-117ee5a6-7395-4adf-817a-6914ec6202f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received unexpected event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 for instance with vm_state active and task_state None.
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.731 183134 DEBUG nova.compute.manager [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6r9pss4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22bc0323-ee7d-4b6e-992e-a2410bf240e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.755 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "refresh_cache-22bc0323-ee7d-4b6e-992e-a2410bf240e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.755 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquired lock "refresh_cache-22bc0323-ee7d-4b6e-992e-a2410bf240e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:29:30 np0005601977 nova_compute[183130]: 2026-01-30 09:29:30.755 183134 DEBUG nova.network.neutron [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:29:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:31.125 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:29:31 np0005601977 nova_compute[183130]: 2026-01-30 09:29:31.385 183134 DEBUG nova.network.neutron [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Successfully updated port: 4433db17-a607-4a44-9251-c5e602dc0576 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 30 04:29:31 np0005601977 nova_compute[183130]: 2026-01-30 09:29:31.424 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:29:31 np0005601977 nova_compute[183130]: 2026-01-30 09:29:31.424 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:29:31 np0005601977 nova_compute[183130]: 2026-01-30 09:29:31.424 183134 DEBUG nova.network.neutron [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:29:32 np0005601977 nova_compute[183130]: 2026-01-30 09:29:32.422 183134 DEBUG nova.network.neutron [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:29:32 np0005601977 nova_compute[183130]: 2026-01-30 09:29:32.784 183134 DEBUG nova.compute.manager [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-changed-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:29:32 np0005601977 nova_compute[183130]: 2026-01-30 09:29:32.784 183134 DEBUG nova.compute.manager [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Refreshing instance network info cache due to event network-changed-4433db17-a607-4a44-9251-c5e602dc0576. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:29:32 np0005601977 nova_compute[183130]: 2026-01-30 09:29:32.785 183134 DEBUG oslo_concurrency.lockutils [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:29:33 np0005601977 podman[216063]: 2026-01-30 09:29:33.835968483 +0000 UTC m=+0.056806444 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.475 183134 DEBUG nova.network.neutron [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updating instance_info_cache with network_info: [{"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.484 183134 DEBUG nova.network.neutron [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Updating instance_info_cache with network_info: [{"id": "e680749e-01e2-462e-8755-8b4f01e1272e", "address": "fa:16:3e:92:37:22", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape680749e-01", "ovs_interfaceid": "e680749e-01e2-462e-8755-8b4f01e1272e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.508 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Releasing lock "refresh_cache-22bc0323-ee7d-4b6e-992e-a2410bf240e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.510 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6r9pss4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22bc0323-ee7d-4b6e-992e-a2410bf240e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.511 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Creating instance directory: /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.511 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Creating disk.info with the contents: {'/var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk': 'qcow2', '/var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.512 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.512 183134 DEBUG nova.objects.instance [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 22bc0323-ee7d-4b6e-992e-a2410bf240e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.514 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.514 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Instance network_info: |[{"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.514 183134 DEBUG oslo_concurrency.lockutils [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.515 183134 DEBUG nova.network.neutron [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Refreshing network info cache for port 4433db17-a607-4a44-9251-c5e602dc0576 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.518 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Start _get_guest_xml network_info=[{"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.523 183134 WARNING nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.540 183134 DEBUG nova.virt.libvirt.host [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.541 183134 DEBUG nova.virt.libvirt.host [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.545 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.572 183134 DEBUG nova.virt.libvirt.host [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.573 183134 DEBUG nova.virt.libvirt.host [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.575 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.576 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.576 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.576 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.577 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.577 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.577 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.577 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.578 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.578 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.578 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.578 183134 DEBUG nova.virt.hardware [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.583 183134 DEBUG nova.virt.libvirt.vif [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:29:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-149082075',display_name='tempest-TestNetworkAdvancedServerOps-server-149082075',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-149082075',id=21,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKizssFknmOKm5eKVAWB6797WgwhrtXfhU+/0cyecNJIu+hHN3gvLXGJMRDzhKYD8/8v9exvNpKsHuhoX+8PPA8mlsBy0hC0QpmrhJ0OUKXCR52DAu2aaKvfZix0Lc+IiQ==',key_name='tempest-TestNetworkAdvancedServerOps-1084377927',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-e0w57m40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:29:23Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=362db3ae-3984-411e-994b-55924dc0c06f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.583 183134 DEBUG nova.network.os_vif_util [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.584 183134 DEBUG nova.network.os_vif_util [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.585 183134 DEBUG nova.objects.instance [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 362db3ae-3984-411e-994b-55924dc0c06f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.606 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <uuid>362db3ae-3984-411e-994b-55924dc0c06f</uuid>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <name>instance-00000015</name>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-149082075</nova:name>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:29:34</nova:creationTime>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        <nova:port uuid="4433db17-a607-4a44-9251-c5e602dc0576">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <entry name="serial">362db3ae-3984-411e-994b-55924dc0c06f</entry>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <entry name="uuid">362db3ae-3984-411e-994b-55924dc0c06f</entry>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.config"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:1c:24:1c"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <target dev="tap4433db17-a6"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/console.log" append="off"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:29:34 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:34 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:34 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:34 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.608 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Preparing to wait for external event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.608 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.608 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.609 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.609 183134 DEBUG nova.virt.libvirt.vif [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:29:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-149082075',display_name='tempest-TestNetworkAdvancedServerOps-server-149082075',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-149082075',id=21,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKizssFknmOKm5eKVAWB6797WgwhrtXfhU+/0cyecNJIu+hHN3gvLXGJMRDzhKYD8/8v9exvNpKsHuhoX+8PPA8mlsBy0hC0QpmrhJ0OUKXCR52DAu2aaKvfZix0Lc+IiQ==',key_name='tempest-TestNetworkAdvancedServerOps-1084377927',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-e0w57m40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:29:23Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=362db3ae-3984-411e-994b-55924dc0c06f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.610 183134 DEBUG nova.network.os_vif_util [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.611 183134 DEBUG nova.network.os_vif_util [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.611 183134 DEBUG os_vif [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.612 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.612 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.613 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.617 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.618 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4433db17-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.619 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4433db17-a6, col_values=(('external_ids', {'iface-id': '4433db17-a607-4a44-9251-c5e602dc0576', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:24:1c', 'vm-uuid': '362db3ae-3984-411e-994b-55924dc0c06f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:34 np0005601977 NetworkManager[55565]: <info>  [1769765374.6222] manager: (tap4433db17-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.625 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.625 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.626 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.626 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.642 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.654 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.656 183134 INFO os_vif [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6')#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.689 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.690 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.714 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.714 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.715 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.761 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.762 183134 DEBUG nova.virt.disk.api [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Checking if we can resize image /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.762 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.809 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.811 183134 DEBUG nova.virt.disk.api [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Cannot resize image /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.811 183134 DEBUG nova.objects.instance [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lazy-loading 'migration_context' on Instance uuid 22bc0323-ee7d-4b6e-992e-a2410bf240e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.910 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.929 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.config 485376" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.930 183134 DEBUG nova.virt.libvirt.volume.remotefs [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.config to /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.931 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.config /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.949 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.950 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.950 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:1c:24:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:29:34 np0005601977 nova_compute[183130]: 2026-01-30 09:29:34.950 183134 INFO nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Using config drive#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.216 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.420 183134 DEBUG oslo_concurrency.processutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.config /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.421 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.422 183134 DEBUG nova.virt.libvirt.vif [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:27:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-481398456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-481398456',id=16,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-5f3v94ca',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:29:19Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=22bc0323-ee7d-4b6e-992e-a2410bf240e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e680749e-01e2-462e-8755-8b4f01e1272e", "address": "fa:16:3e:92:37:22", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape680749e-01", "ovs_interfaceid": "e680749e-01e2-462e-8755-8b4f01e1272e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.422 183134 DEBUG nova.network.os_vif_util [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converting VIF {"id": "e680749e-01e2-462e-8755-8b4f01e1272e", "address": "fa:16:3e:92:37:22", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape680749e-01", "ovs_interfaceid": "e680749e-01e2-462e-8755-8b4f01e1272e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.423 183134 DEBUG nova.network.os_vif_util [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:37:22,bridge_name='br-int',has_traffic_filtering=True,id=e680749e-01e2-462e-8755-8b4f01e1272e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape680749e-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.424 183134 DEBUG os_vif [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:37:22,bridge_name='br-int',has_traffic_filtering=True,id=e680749e-01e2-462e-8755-8b4f01e1272e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape680749e-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.424 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.425 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.425 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.429 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.429 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape680749e-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.430 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape680749e-01, col_values=(('external_ids', {'iface-id': 'e680749e-01e2-462e-8755-8b4f01e1272e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:37:22', 'vm-uuid': '22bc0323-ee7d-4b6e-992e-a2410bf240e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:35 np0005601977 NetworkManager[55565]: <info>  [1769765375.4334] manager: (tape680749e-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.432 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.436 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.440 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.442 183134 INFO os_vif [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:37:22,bridge_name='br-int',has_traffic_filtering=True,id=e680749e-01e2-462e-8755-8b4f01e1272e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape680749e-01')#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.442 183134 DEBUG nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.443 183134 DEBUG nova.compute.manager [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6r9pss4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22bc0323-ee7d-4b6e-992e-a2410bf240e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.854 183134 INFO nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Creating config drive at /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.config#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.859 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu8j8vml_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:35 np0005601977 nova_compute[183130]: 2026-01-30 09:29:35.976 183134 DEBUG oslo_concurrency.processutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu8j8vml_" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:36 np0005601977 NetworkManager[55565]: <info>  [1769765376.0285] manager: (tap4433db17-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Jan 30 04:29:36 np0005601977 kernel: tap4433db17-a6: entered promiscuous mode
Jan 30 04:29:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:36Z|00150|binding|INFO|Claiming lport 4433db17-a607-4a44-9251-c5e602dc0576 for this chassis.
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.035 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:36Z|00151|binding|INFO|4433db17-a607-4a44-9251-c5e602dc0576: Claiming fa:16:3e:1c:24:1c 10.100.0.14
Jan 30 04:29:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:36Z|00152|binding|INFO|Setting lport 4433db17-a607-4a44-9251-c5e602dc0576 ovn-installed in OVS
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.057 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.059 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:24:1c 10.100.0.14'], port_security=['fa:16:3e:1c:24:1c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60eba810-de66-4c2e-8c3c-70333d77e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9d31663-60dd-457d-986c-66184f7449fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e93af628-2f8e-4aae-a24f-d949db135cfb, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=4433db17-a607-4a44-9251-c5e602dc0576) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.060 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 4433db17-a607-4a44-9251-c5e602dc0576 in datapath 60eba810-de66-4c2e-8c3c-70333d77e79c bound to our chassis#033[00m
Jan 30 04:29:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:36Z|00153|binding|INFO|Setting lport 4433db17-a607-4a44-9251-c5e602dc0576 up in Southbound
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.062 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60eba810-de66-4c2e-8c3c-70333d77e79c#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.068 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee77c69-ecde-4d35-8682-d5fcaeee6f8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.069 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap60eba810-d1 in ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:29:36 np0005601977 systemd-machined[154431]: New machine qemu-13-instance-00000015.
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.070 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap60eba810-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.071 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa2e67e-ac11-4d3c-9494-2e0bf79f005a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.071 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[29a73c09-51da-44cf-a891-deef8ccf7f9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 systemd[1]: Started Virtual Machine qemu-13-instance-00000015.
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.084 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[83a39385-3298-430d-b3a5-9531f2c76bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.092 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cc85b421-d2eb-4e3c-adc6-15d9b4d4db2d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 systemd-udevd[216135]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:36 np0005601977 NetworkManager[55565]: <info>  [1769765376.1070] device (tap4433db17-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:29:36 np0005601977 NetworkManager[55565]: <info>  [1769765376.1083] device (tap4433db17-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.123 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc8b18a-c487-4515-abaa-6f899f6e1eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 NetworkManager[55565]: <info>  [1769765376.1356] manager: (tap60eba810-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.133 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[40544ea3-3d31-41a3-94e6-0c38a7658d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.164 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2e470c44-5e80-40d5-b188-ed0050a0c0ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.169 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d7fd13c9-3d38-46b2-81a5-2acecad9204d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 NetworkManager[55565]: <info>  [1769765376.1901] device (tap60eba810-d0): carrier: link connected
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.196 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[114ae310-8018-4a02-930c-fff0d46250c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.208 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f7366857-b46d-4d0e-b3a2-79da85eb9e9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60eba810-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:47:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 390753, 'reachable_time': 24701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216169, 'error': None, 'target': 'ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.225 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b7637d88-951d-463d-8599-6efda2db31a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:479b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 390753, 'tstamp': 390753}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216171, 'error': None, 'target': 'ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.238 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f085eb3f-aaca-47de-9d2e-21f05a837331]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60eba810-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:47:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 390753, 'reachable_time': 24701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216179, 'error': None, 'target': 'ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.264 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1c06de-6bb7-4d03-8b52-34fe9190c24e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.296 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765376.2958484, 362db3ae-3984-411e-994b-55924dc0c06f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.296 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] VM Started (Lifecycle Event)#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.326 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4178e7-79e2-4b62-bade-11a4673c6fec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.328 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60eba810-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.328 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.328 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60eba810-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.331 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:36 np0005601977 NetworkManager[55565]: <info>  [1769765376.3315] manager: (tap60eba810-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 30 04:29:36 np0005601977 kernel: tap60eba810-d0: entered promiscuous mode
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.336 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.337 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60eba810-d0, col_values=(('external_ids', {'iface-id': '34137aa8-3b0b-4b19-b520-be2930318935'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:36 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:36Z|00154|binding|INFO|Releasing lport 34137aa8-3b0b-4b19-b520-be2930318935 from this chassis (sb_readonly=0)
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.346 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/60eba810-de66-4c2e-8c3c-70333d77e79c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/60eba810-de66-4c2e-8c3c-70333d77e79c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.347 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[15462065-9af2-4d96-a7aa-eb28eeee1070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.347 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.348 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-60eba810-de66-4c2e-8c3c-70333d77e79c
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/60eba810-de66-4c2e-8c3c-70333d77e79c.pid.haproxy
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 60eba810-de66-4c2e-8c3c-70333d77e79c
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.348 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:36.349 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c', 'env', 'PROCESS_TAG=haproxy-60eba810-de66-4c2e-8c3c-70333d77e79c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/60eba810-de66-4c2e-8c3c-70333d77e79c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.352 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765376.2982037, 362db3ae-3984-411e-994b-55924dc0c06f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.352 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.407 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.412 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.489 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:29:36 np0005601977 podman[216215]: 2026-01-30 09:29:36.672596466 +0000 UTC m=+0.043478003 container create 0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 30 04:29:36 np0005601977 systemd[1]: Started libpod-conmon-0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4.scope.
Jan 30 04:29:36 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:29:36 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08b0058770c6cded8151a7cda7fcd7f5137af9ea74c2cd106c44d9e57dba828c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:29:36 np0005601977 podman[216215]: 2026-01-30 09:29:36.744677575 +0000 UTC m=+0.115559142 container init 0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:29:36 np0005601977 podman[216215]: 2026-01-30 09:29:36.651506193 +0000 UTC m=+0.022387740 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:29:36 np0005601977 podman[216215]: 2026-01-30 09:29:36.749605016 +0000 UTC m=+0.120486553 container start 0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 30 04:29:36 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [NOTICE]   (216234) : New worker (216236) forked
Jan 30 04:29:36 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [NOTICE]   (216234) : Loading success.
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.938 183134 DEBUG nova.network.neutron [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updated VIF entry in instance network info cache for port 4433db17-a607-4a44-9251-c5e602dc0576. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.939 183134 DEBUG nova.network.neutron [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updating instance_info_cache with network_info: [{"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:36 np0005601977 nova_compute[183130]: 2026-01-30 09:29:36.985 183134 DEBUG oslo_concurrency.lockutils [req-f348c4fc-c6a1-4847-9463-804d93edb0b3 req-9beea3d1-3b21-45a2-9b21-29d3d20019d0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.281 183134 DEBUG nova.compute.manager [req-39898869-e057-4fcb-b519-17e68a52d9ce req-88ad7495-b65c-455a-8e73-5692f5681a24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.281 183134 DEBUG oslo_concurrency.lockutils [req-39898869-e057-4fcb-b519-17e68a52d9ce req-88ad7495-b65c-455a-8e73-5692f5681a24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.282 183134 DEBUG oslo_concurrency.lockutils [req-39898869-e057-4fcb-b519-17e68a52d9ce req-88ad7495-b65c-455a-8e73-5692f5681a24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.282 183134 DEBUG oslo_concurrency.lockutils [req-39898869-e057-4fcb-b519-17e68a52d9ce req-88ad7495-b65c-455a-8e73-5692f5681a24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.283 183134 DEBUG nova.compute.manager [req-39898869-e057-4fcb-b519-17e68a52d9ce req-88ad7495-b65c-455a-8e73-5692f5681a24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Processing event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.284 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.292 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.293 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765377.2915938, 362db3ae-3984-411e-994b-55924dc0c06f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.294 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.302 183134 INFO nova.virt.libvirt.driver [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Instance spawned successfully.#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.303 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.344 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.347 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.355 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.355 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.355 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.356 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.356 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.356 183134 DEBUG nova.virt.libvirt.driver [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.387 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.424 183134 INFO nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Took 13.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.425 183134 DEBUG nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.429 183134 DEBUG nova.compute.manager [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-changed-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.429 183134 DEBUG nova.compute.manager [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Refreshing instance network info cache due to event network-changed-695209cb-0de3-443c-9e7f-c65894975f23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.430 183134 DEBUG oslo_concurrency.lockutils [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.430 183134 DEBUG oslo_concurrency.lockutils [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.430 183134 DEBUG nova.network.neutron [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Refreshing network info cache for port 695209cb-0de3-443c-9e7f-c65894975f23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.510 183134 INFO nova.compute.manager [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Took 14.54 seconds to build instance.#033[00m
Jan 30 04:29:37 np0005601977 nova_compute[183130]: 2026-01-30 09:29:37.529 183134 DEBUG oslo_concurrency.lockutils [None req-35009c78-5588-4dde-b953-67f57575f5c3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:39 np0005601977 nova_compute[183130]: 2026-01-30 09:29:39.527 183134 DEBUG nova.compute.manager [req-223cc8a9-a8b1-4c87-8a07-d66939ea4029 req-a78fcc3b-b55c-43dd-b9b7-e1b36f5355e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:39 np0005601977 nova_compute[183130]: 2026-01-30 09:29:39.528 183134 DEBUG oslo_concurrency.lockutils [req-223cc8a9-a8b1-4c87-8a07-d66939ea4029 req-a78fcc3b-b55c-43dd-b9b7-e1b36f5355e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:39 np0005601977 nova_compute[183130]: 2026-01-30 09:29:39.528 183134 DEBUG oslo_concurrency.lockutils [req-223cc8a9-a8b1-4c87-8a07-d66939ea4029 req-a78fcc3b-b55c-43dd-b9b7-e1b36f5355e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:39 np0005601977 nova_compute[183130]: 2026-01-30 09:29:39.528 183134 DEBUG oslo_concurrency.lockutils [req-223cc8a9-a8b1-4c87-8a07-d66939ea4029 req-a78fcc3b-b55c-43dd-b9b7-e1b36f5355e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:39 np0005601977 nova_compute[183130]: 2026-01-30 09:29:39.528 183134 DEBUG nova.compute.manager [req-223cc8a9-a8b1-4c87-8a07-d66939ea4029 req-a78fcc3b-b55c-43dd-b9b7-e1b36f5355e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] No waiting events found dispatching network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:39 np0005601977 nova_compute[183130]: 2026-01-30 09:29:39.529 183134 WARNING nova.compute.manager [req-223cc8a9-a8b1-4c87-8a07-d66939ea4029 req-a78fcc3b-b55c-43dd-b9b7-e1b36f5355e4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received unexpected event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:29:40 np0005601977 nova_compute[183130]: 2026-01-30 09:29:40.218 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:40 np0005601977 nova_compute[183130]: 2026-01-30 09:29:40.432 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:40Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:ea:ed 10.100.0.8
Jan 30 04:29:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:40Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:ea:ed 10.100.0.8
Jan 30 04:29:40 np0005601977 nova_compute[183130]: 2026-01-30 09:29:40.904 183134 DEBUG oslo_concurrency.lockutils [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "interface-65e07f9f-264b-4e0d-9aa7-f87ebaf84705-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:40 np0005601977 nova_compute[183130]: 2026-01-30 09:29:40.904 183134 DEBUG oslo_concurrency.lockutils [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-65e07f9f-264b-4e0d-9aa7-f87ebaf84705-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:40 np0005601977 nova_compute[183130]: 2026-01-30 09:29:40.905 183134 DEBUG nova.objects.instance [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'flavor' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.512 183134 DEBUG nova.network.neutron [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Port e680749e-01e2-462e-8755-8b4f01e1272e updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.514 183134 DEBUG nova.compute.manager [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6r9pss4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22bc0323-ee7d-4b6e-992e-a2410bf240e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 30 04:29:41 np0005601977 systemd[1]: Starting libvirt proxy daemon...
Jan 30 04:29:41 np0005601977 systemd[1]: Started libvirt proxy daemon.
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.704 183134 DEBUG nova.objects.instance [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_requests' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.718 183134 DEBUG nova.network.neutron [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:29:41 np0005601977 kernel: tape680749e-01: entered promiscuous mode
Jan 30 04:29:41 np0005601977 NetworkManager[55565]: <info>  [1769765381.8326] manager: (tape680749e-01): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.834 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:41Z|00155|binding|INFO|Claiming lport e680749e-01e2-462e-8755-8b4f01e1272e for this additional chassis.
Jan 30 04:29:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:41Z|00156|binding|INFO|e680749e-01e2-462e-8755-8b4f01e1272e: Claiming fa:16:3e:92:37:22 10.100.0.5
Jan 30 04:29:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:41Z|00157|binding|INFO|Claiming lport 40336582-d4ab-46e5-9089-cf09f796f51f for this additional chassis.
Jan 30 04:29:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:41Z|00158|binding|INFO|40336582-d4ab-46e5-9089-cf09f796f51f: Claiming fa:16:3e:b4:cf:3e 19.80.0.151
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.868 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.874 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:41Z|00159|binding|INFO|Setting lport e680749e-01e2-462e-8755-8b4f01e1272e ovn-installed in OVS
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.878 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:41 np0005601977 systemd-machined[154431]: New machine qemu-14-instance-00000010.
Jan 30 04:29:41 np0005601977 systemd[1]: Started Virtual Machine qemu-14-instance-00000010.
Jan 30 04:29:41 np0005601977 systemd-udevd[216329]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:41 np0005601977 NetworkManager[55565]: <info>  [1769765381.9256] device (tape680749e-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:29:41 np0005601977 NetworkManager[55565]: <info>  [1769765381.9264] device (tape680749e-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.931 183134 DEBUG nova.network.neutron [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updated VIF entry in instance network info cache for port 695209cb-0de3-443c-9e7f-c65894975f23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.931 183134 DEBUG nova.network.neutron [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:41 np0005601977 podman[216301]: 2026-01-30 09:29:41.947381768 +0000 UTC m=+0.065565615 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 30 04:29:41 np0005601977 nova_compute[183130]: 2026-01-30 09:29:41.954 183134 DEBUG oslo_concurrency.lockutils [req-9c6f1d42-0a92-406f-9e70-3cfe9005e62a req-be13545e-4488-4a33-bb61-bab1b0138807 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:41 np0005601977 podman[216299]: 2026-01-30 09:29:41.956138048 +0000 UTC m=+0.088199541 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:29:42 np0005601977 nova_compute[183130]: 2026-01-30 09:29:42.409 183134 DEBUG nova.policy [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:29:42 np0005601977 nova_compute[183130]: 2026-01-30 09:29:42.540 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765382.5405102, 22bc0323-ee7d-4b6e-992e-a2410bf240e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:42 np0005601977 nova_compute[183130]: 2026-01-30 09:29:42.541 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] VM Started (Lifecycle Event)#033[00m
Jan 30 04:29:42 np0005601977 nova_compute[183130]: 2026-01-30 09:29:42.566 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:43 np0005601977 nova_compute[183130]: 2026-01-30 09:29:43.368 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765383.3684359, 22bc0323-ee7d-4b6e-992e-a2410bf240e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:43 np0005601977 nova_compute[183130]: 2026-01-30 09:29:43.369 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:29:43 np0005601977 nova_compute[183130]: 2026-01-30 09:29:43.420 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:43 np0005601977 nova_compute[183130]: 2026-01-30 09:29:43.423 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:29:43 np0005601977 nova_compute[183130]: 2026-01-30 09:29:43.472 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com#033[00m
Jan 30 04:29:44 np0005601977 nova_compute[183130]: 2026-01-30 09:29:44.498 183134 DEBUG nova.network.neutron [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Successfully created port: b82506d0-07bd-485e-9c87-976a162f45f9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:29:44 np0005601977 nova_compute[183130]: 2026-01-30 09:29:44.756 183134 DEBUG nova.compute.manager [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-changed-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:44 np0005601977 nova_compute[183130]: 2026-01-30 09:29:44.757 183134 DEBUG nova.compute.manager [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Refreshing instance network info cache due to event network-changed-4433db17-a607-4a44-9251-c5e602dc0576. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:29:44 np0005601977 nova_compute[183130]: 2026-01-30 09:29:44.757 183134 DEBUG oslo_concurrency.lockutils [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:44 np0005601977 nova_compute[183130]: 2026-01-30 09:29:44.758 183134 DEBUG oslo_concurrency.lockutils [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:44 np0005601977 nova_compute[183130]: 2026-01-30 09:29:44.758 183134 DEBUG nova.network.neutron [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Refreshing network info cache for port 4433db17-a607-4a44-9251-c5e602dc0576 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:29:45 np0005601977 nova_compute[183130]: 2026-01-30 09:29:45.259 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:45 np0005601977 nova_compute[183130]: 2026-01-30 09:29:45.434 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:45Z|00160|binding|INFO|Changing chassis for lport e680749e-01e2-462e-8755-8b4f01e1272e from d14b9ab5-bf6e-4142-ad45-b863645e483d to 9be64184-856f-4986-a80e-9403fa35a6a5.
Jan 30 04:29:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:45Z|00161|binding|INFO|e680749e-01e2-462e-8755-8b4f01e1272e: Claiming fa:16:3e:92:37:22 10.100.0.5
Jan 30 04:29:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:45Z|00162|binding|INFO|Changing chassis for lport 40336582-d4ab-46e5-9089-cf09f796f51f from d14b9ab5-bf6e-4142-ad45-b863645e483d to 9be64184-856f-4986-a80e-9403fa35a6a5.
Jan 30 04:29:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:45Z|00163|binding|INFO|40336582-d4ab-46e5-9089-cf09f796f51f: Claiming fa:16:3e:b4:cf:3e 19.80.0.151
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.958 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:37:22 10.100.0.5'], port_security=['fa:16:3e:92:37:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-547456304', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-547456304', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '9', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76c186c3-e40e-4db5-b50c-3686091722f9, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=e680749e-01e2-462e-8755-8b4f01e1272e) old=Port_Binding(additional_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.960 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:cf:3e 19.80.0.151'], port_security=['fa:16:3e:b4:cf:3e 19.80.0.151'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['e680749e-01e2-462e-8755-8b4f01e1272e'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-367508310', 'neutron:cidrs': '19.80.0.151/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01f97378-9667-4aa0-9b75-db68873adbbb', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-367508310', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=ca9da3af-412a-43bc-885e-95d28caf9a34, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=40336582-d4ab-46e5-9089-cf09f796f51f) old=Port_Binding(additional_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.961 104706 INFO neutron.agent.ovn.metadata.agent [-] Port e680749e-01e2-462e-8755-8b4f01e1272e in datapath 8e0e3ea2-5897-4c05-8f15-ccf8330993c7 bound to our chassis#033[00m
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.963 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e3ea2-5897-4c05-8f15-ccf8330993c7#033[00m
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.974 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b887605c-4575-4297-9634-6ed00bac3563]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.993 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[55866d0f-aefd-44b9-b8ef-0c9115dce0f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:45.995 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[aecfd51c-c8d9-498d-baac-d2a072794b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.014 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f23a2d-d060-41ca-b211-e3780c48b8e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.026 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e12308-6a07-4ddd-a89c-09d837bd9884]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e3ea2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:a6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 9, 'rx_bytes': 1372, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 9, 'rx_bytes': 1372, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366983, 'reachable_time': 24925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216376, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.044 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0a13fe77-a6f1-4392-9115-ffcc6a7761e8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366992, 'tstamp': 366992}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216377, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366995, 'tstamp': 366995}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216377, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.046 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e3ea2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.049 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.049 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e3ea2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.049 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.049 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e3ea2-50, col_values=(('external_ids', {'iface-id': '15b4d9a6-bad1-4bf8-a262-02e27eb8ea93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.050 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.052 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 40336582-d4ab-46e5-9089-cf09f796f51f in datapath 01f97378-9667-4aa0-9b75-db68873adbbb bound to our chassis#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.054 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01f97378-9667-4aa0-9b75-db68873adbbb#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.061 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d17d93ab-c80f-4666-851d-b25b51d3fbcb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.062 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01f97378-91 in ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.064 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01f97378-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.064 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb10d87-28a9-433c-94a3-b9d6c9cc2f29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.065 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf89fa0-764b-46b0-9e6a-83ee28cbea33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.076 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[354c2558-3b8f-402f-bb99-5a2d86145592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.093 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[128b3d7c-b966-43aa-8c94-bee273ca6818]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.115 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e20d7ecf-b63a-49a8-bf85-15d98dd84b17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.120 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[db3399a6-a20f-45f4-9982-cb44463a5d72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 NetworkManager[55565]: <info>  [1769765386.1223] manager: (tap01f97378-90): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Jan 30 04:29:46 np0005601977 systemd-udevd[216385]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.144 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0062f162-2da3-4176-af57-60f038661176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.148 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[7aaddc6f-fa2a-4d9c-846c-6cc9b87493f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 NetworkManager[55565]: <info>  [1769765386.1712] device (tap01f97378-90): carrier: link connected
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.174 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a205e932-0d74-46b9-ab4f-049d5042d183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.186 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6248cc-a333-47fb-a1e6-fbc53cd10a8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01f97378-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:f5:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391751, 'reachable_time': 36726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216404, 'error': None, 'target': 'ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.199 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d30d9ec2-509b-4b2a-9104-cbdf7c37f9ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:f5a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391751, 'tstamp': 391751}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216405, 'error': None, 'target': 'ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.212 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a90b7039-7fd6-4ad5-a3b8-bb6df00d4b97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01f97378-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:f5:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391751, 'reachable_time': 36726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216406, 'error': None, 'target': 'ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.233 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[629a8542-53f0-4176-9f3f-3c89bc9a57ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.285 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[74141c3e-72e3-4f66-a33d-be66547d6c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.286 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01f97378-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.286 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.287 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01f97378-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:46 np0005601977 kernel: tap01f97378-90: entered promiscuous mode
Jan 30 04:29:46 np0005601977 NetworkManager[55565]: <info>  [1769765386.2900] manager: (tap01f97378-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.289 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.291 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.295 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01f97378-90, col_values=(('external_ids', {'iface-id': '66227ea9-e1c4-4b2a-8e46-c63d6ca3d55b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.297 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:46Z|00164|binding|INFO|Releasing lport 66227ea9-e1c4-4b2a-8e46-c63d6ca3d55b from this chassis (sb_readonly=0)
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.298 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.298 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01f97378-9667-4aa0-9b75-db68873adbbb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01f97378-9667-4aa0-9b75-db68873adbbb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.299 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a65ed680-31f6-4012-911b-0c8e50b7e15e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.300 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-01f97378-9667-4aa0-9b75-db68873adbbb
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/01f97378-9667-4aa0-9b75-db68873adbbb.pid.haproxy
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 01f97378-9667-4aa0-9b75-db68873adbbb
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:29:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:46.301 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb', 'env', 'PROCESS_TAG=haproxy-01f97378-9667-4aa0-9b75-db68873adbbb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01f97378-9667-4aa0-9b75-db68873adbbb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.302 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.417 183134 DEBUG nova.network.neutron [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Successfully updated port: b82506d0-07bd-485e-9c87-976a162f45f9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.433 183134 DEBUG oslo_concurrency.lockutils [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.433 183134 DEBUG oslo_concurrency.lockutils [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.433 183134 DEBUG nova.network.neutron [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:29:46 np0005601977 podman[216439]: 2026-01-30 09:29:46.617623217 +0000 UTC m=+0.041060004 container create adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:29:46 np0005601977 systemd[1]: Started libpod-conmon-adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13.scope.
Jan 30 04:29:46 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:29:46 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da8690ca0655ddc1bfef94f414d030808ee7e2e809b48b427b8c6b1b61990be0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:29:46 np0005601977 podman[216439]: 2026-01-30 09:29:46.6870161 +0000 UTC m=+0.110452927 container init adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:29:46 np0005601977 podman[216439]: 2026-01-30 09:29:46.691026784 +0000 UTC m=+0.114463581 container start adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 30 04:29:46 np0005601977 podman[216439]: 2026-01-30 09:29:46.596478483 +0000 UTC m=+0.019915290 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:29:46 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [NOTICE]   (216458) : New worker (216460) forked
Jan 30 04:29:46 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [NOTICE]   (216458) : Loading success.
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.790 183134 DEBUG nova.network.neutron [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updated VIF entry in instance network info cache for port 4433db17-a607-4a44-9251-c5e602dc0576. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.790 183134 DEBUG nova.network.neutron [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updating instance_info_cache with network_info: [{"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:46 np0005601977 nova_compute[183130]: 2026-01-30 09:29:46.817 183134 DEBUG oslo_concurrency.lockutils [req-cab1d906-781e-4cfc-acd2-17d697fd9441 req-2ec00398-70d9-4f5d-9a91-17277e8da729 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.140 183134 DEBUG nova.network.neutron [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.171 183134 DEBUG oslo_concurrency.lockutils [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.174 183134 DEBUG nova.virt.libvirt.vif [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.175 183134 DEBUG nova.network.os_vif_util [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.175 183134 DEBUG nova.network.os_vif_util [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.176 183134 DEBUG os_vif [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.176 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.177 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.177 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.180 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.180 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb82506d0-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.180 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb82506d0-07, col_values=(('external_ids', {'iface-id': 'b82506d0-07bd-485e-9c87-976a162f45f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:61:f9', 'vm-uuid': '65e07f9f-264b-4e0d-9aa7-f87ebaf84705'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.182 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.1833] manager: (tapb82506d0-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.183 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.187 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.188 183134 INFO os_vif [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07')#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.188 183134 DEBUG nova.virt.libvirt.vif [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.188 183134 DEBUG nova.network.os_vif_util [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.189 183134 DEBUG nova.network.os_vif_util [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.192 183134 DEBUG nova.virt.libvirt.guest [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] attach device xml: <interface type="ethernet">
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:13:61:f9"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <target dev="tapb82506d0-07"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:29:48 np0005601977 nova_compute[183130]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 30 04:29:48 np0005601977 kernel: tapb82506d0-07: entered promiscuous mode
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.2059] manager: (tapb82506d0-07): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Jan 30 04:29:48 np0005601977 systemd-udevd[216395]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:29:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:48Z|00165|binding|INFO|Claiming lport b82506d0-07bd-485e-9c87-976a162f45f9 for this chassis.
Jan 30 04:29:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:48Z|00166|binding|INFO|b82506d0-07bd-485e-9c87-976a162f45f9: Claiming fa:16:3e:13:61:f9 10.100.0.26
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.210 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.220 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:61:f9 10.100.0.26'], port_security=['fa:16:3e:13:61:f9 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': '65e07f9f-264b-4e0d-9aa7-f87ebaf84705', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': '947fe520-942d-4287-9e9d-738b24a6a1e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e23d140-fb29-4586-9972-38fb3b11b880, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=b82506d0-07bd-485e-9c87-976a162f45f9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.2220] device (tapb82506d0-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.2229] device (tapb82506d0-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.224 104706 INFO neutron.agent.ovn.metadata.agent [-] Port b82506d0-07bd-485e-9c87-976a162f45f9 in datapath d7c17727-fd17-47bd-95c1-452f5edc7e25 bound to our chassis#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.227 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7c17727-fd17-47bd-95c1-452f5edc7e25#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.228 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:48Z|00167|binding|INFO|Setting lport b82506d0-07bd-485e-9c87-976a162f45f9 ovn-installed in OVS
Jan 30 04:29:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:48Z|00168|binding|INFO|Setting lport b82506d0-07bd-485e-9c87-976a162f45f9 up in Southbound
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.237 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e72974db-a820-4262-aa37-8d3f2483d183]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.238 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7c17727-f1 in ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.240 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7c17727-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.240 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[df13e255-e613-40c4-9d49-d44d7326ebe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.240 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fbbdfe12-3ebc-42b8-a102-654c73df8cae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.249 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ccb6fe-6f7c-4128-901b-a3681c54f6e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.272 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e98bfc1a-8053-47b4-8513-f0cd66095884]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.291 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[39965bf7-2d22-451b-a639-d3045dbfce68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.296 183134 DEBUG nova.virt.libvirt.driver [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.296 183134 DEBUG nova.virt.libvirt.driver [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.297 183134 DEBUG nova.virt.libvirt.driver [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:57:c6:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.297 183134 DEBUG nova.virt.libvirt.driver [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:13:61:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.3026] manager: (tapd7c17727-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.301 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[76364099-6e1e-4c68-b21a-a3eb6b6bd9e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.324 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb66807-8174-4aee-93a4-ea3e106e9141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.328 183134 DEBUG nova.virt.libvirt.guest [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:48</nova:creationTime>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:48 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    <nova:port uuid="b82506d0-07bd-485e-9c87-976a162f45f9">
Jan 30 04:29:48 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:48 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:48 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:48 np0005601977 nova_compute[183130]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.329 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9cae1ec9-e607-49a6-aff7-4fffabdb0d5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.3457] device (tapd7c17727-f0): carrier: link connected
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.349 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[cf679c40-3dec-477c-a4d8-7d048cbb5c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.357 183134 DEBUG oslo_concurrency.lockutils [None req-52850535-10bb-4c6c-bf9b-c5925a9bb989 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-65e07f9f-264b-4e0d-9aa7-f87ebaf84705-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.366 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4969dc72-7e5e-4fa8-9588-3cf15ff74965]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7c17727-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:ac:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391968, 'reachable_time': 40334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216500, 'error': None, 'target': 'ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.380 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2d22a44c-4324-43c3-82e0-1a48b59e9431]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:ac77'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391968, 'tstamp': 391968}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216501, 'error': None, 'target': 'ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.392 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[175f71b5-345e-4fe4-be53-ce4e433b1d76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7c17727-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:ac:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391968, 'reachable_time': 40334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216502, 'error': None, 'target': 'ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.416 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8edecd13-4f46-40d4-a83b-e45aec3cdaa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.455 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[92f7e256-ea0b-4c86-a445-6c4f5332b05d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.456 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7c17727-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.456 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.457 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7c17727-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.459 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 NetworkManager[55565]: <info>  [1769765388.4599] manager: (tapd7c17727-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 30 04:29:48 np0005601977 kernel: tapd7c17727-f0: entered promiscuous mode
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.461 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.462 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7c17727-f0, col_values=(('external_ids', {'iface-id': '9c96ce1f-5385-4f8c-93bb-17a210e22b70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.463 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:48Z|00169|binding|INFO|Releasing lport 9c96ce1f-5385-4f8c-93bb-17a210e22b70 from this chassis (sb_readonly=0)
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.468 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.469 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7c17727-fd17-47bd-95c1-452f5edc7e25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7c17727-fd17-47bd-95c1-452f5edc7e25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.469 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f9651d1f-392b-41a4-9f9a-3c73ce44ad88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.470 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-d7c17727-fd17-47bd-95c1-452f5edc7e25
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/d7c17727-fd17-47bd-95c1-452f5edc7e25.pid.haproxy
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID d7c17727-fd17-47bd-95c1-452f5edc7e25
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:29:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:48.470 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'env', 'PROCESS_TAG=haproxy-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7c17727-fd17-47bd-95c1-452f5edc7e25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:29:48 np0005601977 podman[216534]: 2026-01-30 09:29:48.772553144 +0000 UTC m=+0.046599083 container create c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:29:48 np0005601977 systemd[1]: Started libpod-conmon-c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587.scope.
Jan 30 04:29:48 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:29:48 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cfc0d4511764c1380c800a45012256dc3e3e30dd45ede8254a89c70cb9e7ca7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:29:48 np0005601977 podman[216534]: 2026-01-30 09:29:48.750005269 +0000 UTC m=+0.024051238 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:29:48 np0005601977 podman[216534]: 2026-01-30 09:29:48.859389024 +0000 UTC m=+0.133434983 container init c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:29:48 np0005601977 podman[216534]: 2026-01-30 09:29:48.863509722 +0000 UTC m=+0.137555661 container start c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 30 04:29:48 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [NOTICE]   (216553) : New worker (216555) forked
Jan 30 04:29:48 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [NOTICE]   (216553) : Loading success.
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.975 183134 DEBUG nova.compute.manager [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-changed-b82506d0-07bd-485e-9c87-976a162f45f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.976 183134 DEBUG nova.compute.manager [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing instance network info cache due to event network-changed-b82506d0-07bd-485e-9c87-976a162f45f9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.976 183134 DEBUG oslo_concurrency.lockutils [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.976 183134 DEBUG oslo_concurrency.lockutils [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:48 np0005601977 nova_compute[183130]: 2026-01-30 09:29:48.976 183134 DEBUG nova.network.neutron [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing network info cache for port b82506d0-07bd-485e-9c87-976a162f45f9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:29:49 np0005601977 nova_compute[183130]: 2026-01-30 09:29:49.225 183134 DEBUG nova.compute.manager [req-3596b9c5-2f1b-4a51-a7ed-502216c03d49 req-a9b2cf7c-0553-47ee-8228-ace94b60a5fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:49 np0005601977 nova_compute[183130]: 2026-01-30 09:29:49.225 183134 DEBUG oslo_concurrency.lockutils [req-3596b9c5-2f1b-4a51-a7ed-502216c03d49 req-a9b2cf7c-0553-47ee-8228-ace94b60a5fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:49 np0005601977 nova_compute[183130]: 2026-01-30 09:29:49.226 183134 DEBUG oslo_concurrency.lockutils [req-3596b9c5-2f1b-4a51-a7ed-502216c03d49 req-a9b2cf7c-0553-47ee-8228-ace94b60a5fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:49 np0005601977 nova_compute[183130]: 2026-01-30 09:29:49.227 183134 DEBUG oslo_concurrency.lockutils [req-3596b9c5-2f1b-4a51-a7ed-502216c03d49 req-a9b2cf7c-0553-47ee-8228-ace94b60a5fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:49 np0005601977 nova_compute[183130]: 2026-01-30 09:29:49.227 183134 DEBUG nova.compute.manager [req-3596b9c5-2f1b-4a51-a7ed-502216c03d49 req-a9b2cf7c-0553-47ee-8228-ace94b60a5fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:49 np0005601977 nova_compute[183130]: 2026-01-30 09:29:49.227 183134 WARNING nova.compute.manager [req-3596b9c5-2f1b-4a51-a7ed-502216c03d49 req-a9b2cf7c-0553-47ee-8228-ace94b60a5fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received unexpected event network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:29:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:49Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:24:1c 10.100.0.14
Jan 30 04:29:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:49Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:24:1c 10.100.0.14
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.264 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.403 183134 DEBUG oslo_concurrency.lockutils [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "interface-65e07f9f-264b-4e0d-9aa7-f87ebaf84705-b82506d0-07bd-485e-9c87-976a162f45f9" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.404 183134 DEBUG oslo_concurrency.lockutils [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-65e07f9f-264b-4e0d-9aa7-f87ebaf84705-b82506d0-07bd-485e-9c87-976a162f45f9" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.419 183134 DEBUG nova.objects.instance [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'flavor' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.429 183134 DEBUG nova.network.neutron [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updated VIF entry in instance network info cache for port b82506d0-07bd-485e-9c87-976a162f45f9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.429 183134 DEBUG nova.network.neutron [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.454 183134 DEBUG nova.virt.libvirt.vif [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.455 183134 DEBUG nova.network.os_vif_util [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.455 183134 DEBUG nova.network.os_vif_util [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.458 183134 DEBUG oslo_concurrency.lockutils [req-a1c6cbfa-424e-40d9-8728-340b5116e03c req-69e4a351-7a29-4b70-9eee-e1775987a8f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:50 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:50Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:13:61:f9 10.100.0.26
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.461 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:29:50 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:50Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:13:61:f9 10.100.0.26
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.462 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.464 183134 DEBUG nova.virt.libvirt.driver [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Attempting to detach device tapb82506d0-07 from instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.465 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] detach device xml: <interface type="ethernet">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:13:61:f9"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <target dev="tapb82506d0-07"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.471 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.475 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <name>instance-00000013</name>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <uuid>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</uuid>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:48</nova:creationTime>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:port uuid="b82506d0-07bd-485e-9c87-976a162f45f9">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='serial'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='uuid'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk' index='2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config' index='1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:57:c6:12'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='tap2011cfc4-30'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:13:61:f9'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='tapb82506d0-07'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='net1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/1'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c1,c133</label>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c1,c133</imagelabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.476 183134 INFO nova.virt.libvirt.driver [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully detached device tapb82506d0-07 from instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 from the persistent domain config.
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.476 183134 DEBUG nova.virt.libvirt.driver [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] (1/8): Attempting to detach device tapb82506d0-07 with device alias net1 from instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.476 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] detach device xml: <interface type="ethernet">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:13:61:f9"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <target dev="tapb82506d0-07"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 30 04:29:50 np0005601977 kernel: tapb82506d0-07 (unregistering): left promiscuous mode
Jan 30 04:29:50 np0005601977 NetworkManager[55565]: <info>  [1769765390.5856] device (tapb82506d0-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:29:50 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:50Z|00170|binding|INFO|Releasing lport b82506d0-07bd-485e-9c87-976a162f45f9 from this chassis (sb_readonly=0)
Jan 30 04:29:50 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:50Z|00171|binding|INFO|Setting lport b82506d0-07bd-485e-9c87-976a162f45f9 down in Southbound
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.594 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:50 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:50Z|00172|binding|INFO|Removing iface tapb82506d0-07 ovn-installed in OVS
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.598 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.599 183134 DEBUG nova.virt.libvirt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Received event <DeviceRemovedEvent: 1769765390.597984, 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 30 04:29:50 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:50.602 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:61:f9 10.100.0.26'], port_security=['fa:16:3e:13:61:f9 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': '65e07f9f-264b-4e0d-9aa7-f87ebaf84705', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': '947fe520-942d-4287-9e9d-738b24a6a1e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e23d140-fb29-4586-9972-38fb3b11b880, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=b82506d0-07bd-485e-9c87-976a162f45f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.602 183134 DEBUG nova.virt.libvirt.driver [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Start waiting for the detach event from libvirt for device tapb82506d0-07 with device alias net1 for instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.603 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 30 04:29:50 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:50.604 104706 INFO neutron.agent.ovn.metadata.agent [-] Port b82506d0-07bd-485e-9c87-976a162f45f9 in datapath d7c17727-fd17-47bd-95c1-452f5edc7e25 unbound from our chassis
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.604 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:29:50 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:50.606 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7c17727-fd17-47bd-95c1-452f5edc7e25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 30 04:29:50 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:50.607 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[68552b60-7549-4bcc-8fe4-132fd28185f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:29:50 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:50.609 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25 namespace which is not needed anymore
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.616 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <name>instance-00000013</name>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <uuid>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</uuid>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:48</nova:creationTime>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:port uuid="b82506d0-07bd-485e-9c87-976a162f45f9">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='serial'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='uuid'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk' index='2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config' index='1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:57:c6:12'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target dev='tap2011cfc4-30'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/1'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c1,c133</label>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c1,c133</imagelabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.617 183134 INFO nova.virt.libvirt.driver [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully detached device tapb82506d0-07 from instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 from the live domain config.
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.618 183134 DEBUG nova.virt.libvirt.vif [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.619 183134 DEBUG nova.network.os_vif_util [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.620 183134 DEBUG nova.network.os_vif_util [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.621 183134 DEBUG os_vif [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.625 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.626 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb82506d0-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.628 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.630 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.638 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.642 183134 INFO os_vif [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07')#033[00m
Jan 30 04:29:50 np0005601977 nova_compute[183130]: 2026-01-30 09:29:50.643 183134 DEBUG nova.virt.libvirt.guest [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:50</nova:creationTime>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:50 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:50 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:50 np0005601977 nova_compute[183130]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 30 04:29:50 np0005601977 podman[216567]: 2026-01-30 09:29:50.677430506 +0000 UTC m=+0.066193793 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:29:50 np0005601977 podman[216565]: 2026-01-30 09:29:50.711012695 +0000 UTC m=+0.101619644 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 30 04:29:50 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [NOTICE]   (216553) : haproxy version is 2.8.14-c23fe91
Jan 30 04:29:50 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [NOTICE]   (216553) : path to executable is /usr/sbin/haproxy
Jan 30 04:29:50 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [WARNING]  (216553) : Exiting Master process...
Jan 30 04:29:50 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [ALERT]    (216553) : Current worker (216555) exited with code 143 (Terminated)
Jan 30 04:29:50 np0005601977 neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25[216549]: [WARNING]  (216553) : All workers exited. Exiting... (0)
Jan 30 04:29:50 np0005601977 systemd[1]: libpod-c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587.scope: Deactivated successfully.
Jan 30 04:29:50 np0005601977 podman[216623]: 2026-01-30 09:29:50.730478091 +0000 UTC m=+0.045974074 container died c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 30 04:29:50 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587-userdata-shm.mount: Deactivated successfully.
Jan 30 04:29:50 np0005601977 systemd[1]: var-lib-containers-storage-overlay-9cfc0d4511764c1380c800a45012256dc3e3e30dd45ede8254a89c70cb9e7ca7-merged.mount: Deactivated successfully.
Jan 30 04:29:50 np0005601977 podman[216623]: 2026-01-30 09:29:50.760984183 +0000 UTC m=+0.076480156 container cleanup c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 30 04:29:50 np0005601977 systemd[1]: libpod-conmon-c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587.scope: Deactivated successfully.
Jan 30 04:29:51 np0005601977 podman[216654]: 2026-01-30 09:29:51.029462073 +0000 UTC m=+0.250580370 container remove c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.033 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[645dcc38-bdde-4f1f-b2e7-fb89f484ddd0]: (4, ('Fri Jan 30 09:29:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25 (c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587)\nc40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587\nFri Jan 30 09:29:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25 (c40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587)\nc40496eab931749014037626105e37176067b1a26238cdf0dc7a74caeeb51587\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.035 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d59dc367-7e44-439d-b58f-190ea3e695ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.036 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7c17727-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.073 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:51 np0005601977 kernel: tapd7c17727-f0: left promiscuous mode
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.079 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.080 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.082 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[554061eb-1d78-40d0-96f4-f186b695a103]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.102 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[94356e5f-a924-4c5a-a233-1c3c64474649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.104 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea88106-79ab-45f8-b617-2e65a1f0c5c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.119 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6b036a66-6e04-4fd0-ac0a-4b9ac64c5f96]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391962, 'reachable_time': 38399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216670, 'error': None, 'target': 'ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.121 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7c17727-fd17-47bd-95c1-452f5edc7e25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:29:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:51.121 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[97a487ff-0053-42f2-8ea2-9e9e1f11acb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:51 np0005601977 systemd[1]: run-netns-ovnmeta\x2dd7c17727\x2dfd17\x2d47bd\x2d95c1\x2d452f5edc7e25.mount: Deactivated successfully.
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.260 183134 DEBUG oslo_concurrency.lockutils [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.261 183134 DEBUG oslo_concurrency.lockutils [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.262 183134 DEBUG nova.network.neutron [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.312 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.313 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.313 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.313 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.314 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.314 183134 WARNING nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received unexpected event network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.314 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-unplugged-b82506d0-07bd-485e-9c87-976a162f45f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.315 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.315 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.316 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.316 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-unplugged-b82506d0-07bd-485e-9c87-976a162f45f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.316 183134 WARNING nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received unexpected event network-vif-unplugged-b82506d0-07bd-485e-9c87-976a162f45f9 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.317 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.317 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.317 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.318 183134 DEBUG oslo_concurrency.lockutils [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.318 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.318 183134 WARNING nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received unexpected event network-vif-plugged-b82506d0-07bd-485e-9c87-976a162f45f9 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.318 183134 DEBUG nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-deleted-b82506d0-07bd-485e-9c87-976a162f45f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.319 183134 INFO nova.compute.manager [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Neutron deleted interface b82506d0-07bd-485e-9c87-976a162f45f9; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.319 183134 DEBUG nova.network.neutron [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.344 183134 DEBUG nova.objects.instance [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lazy-loading 'system_metadata' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.395 183134 DEBUG nova.objects.instance [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lazy-loading 'flavor' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.414 183134 DEBUG nova.virt.libvirt.vif [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.414 183134 DEBUG nova.network.os_vif_util [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converting VIF {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.414 183134 DEBUG nova.network.os_vif_util [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.417 183134 DEBUG nova.virt.libvirt.guest [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.421 183134 DEBUG nova.virt.libvirt.guest [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <name>instance-00000013</name>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <uuid>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</uuid>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:50</nova:creationTime>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='serial'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='uuid'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk' index='2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config' index='1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:57:c6:12'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target dev='tap2011cfc4-30'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/1'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c1,c133</label>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c1,c133</imagelabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.421 183134 DEBUG nova.virt.libvirt.guest [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.424 183134 DEBUG nova.virt.libvirt.guest [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:13:61:f9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb82506d0-07"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <name>instance-00000013</name>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <uuid>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</uuid>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:50</nova:creationTime>
Jan 30 04:29:51 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='serial'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='uuid'>65e07f9f-264b-4e0d-9aa7-f87ebaf84705</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk' index='2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/disk.config' index='1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:57:c6:12'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target dev='tap2011cfc4-30'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/1'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <source path='/dev/pts/1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705/console.log' append='off'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c1,c133</label>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c1,c133</imagelabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.425 183134 WARNING nova.virt.libvirt.driver [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Detaching interface fa:16:3e:13:61:f9 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapb82506d0-07' not found.#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.425 183134 DEBUG nova.virt.libvirt.vif [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.426 183134 DEBUG nova.network.os_vif_util [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converting VIF {"id": "b82506d0-07bd-485e-9c87-976a162f45f9", "address": "fa:16:3e:13:61:f9", "network": {"id": "d7c17727-fd17-47bd-95c1-452f5edc7e25", "bridge": "br-int", "label": "tempest-network-smoke--1604540109", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb82506d0-07", "ovs_interfaceid": "b82506d0-07bd-485e-9c87-976a162f45f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.426 183134 DEBUG nova.network.os_vif_util [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.427 183134 DEBUG os_vif [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.428 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.428 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb82506d0-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.429 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.430 183134 INFO os_vif [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:61:f9,bridge_name='br-int',has_traffic_filtering=True,id=b82506d0-07bd-485e-9c87-976a162f45f9,network=Network(d7c17727-fd17-47bd-95c1-452f5edc7e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb82506d0-07')#033[00m
Jan 30 04:29:51 np0005601977 nova_compute[183130]: 2026-01-30 09:29:51.431 183134 DEBUG nova.virt.libvirt.guest [req-11bdc879-a897-479d-bd53-74bc9a2d4145 req-ae8e6cda-4c6d-4bf0-8eb4-acaa98e8f59b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-1268443133</nova:name>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:29:51</nova:creationTime>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    <nova:port uuid="2011cfc4-3053-450f-9a91-99928686bc26">
Jan 30 04:29:51 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:29:51 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:29:51 np0005601977 nova_compute[183130]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 30 04:29:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:52Z|00173|binding|INFO|Releasing lport 66227ea9-e1c4-4b2a-8e46-c63d6ca3d55b from this chassis (sb_readonly=0)
Jan 30 04:29:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:52Z|00174|binding|INFO|Releasing lport d9b977ba-c89f-4674-bf61-3e41f1337fa6 from this chassis (sb_readonly=0)
Jan 30 04:29:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:52Z|00175|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:29:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:52Z|00176|binding|INFO|Releasing lport 34137aa8-3b0b-4b19-b520-be2930318935 from this chassis (sb_readonly=0)
Jan 30 04:29:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:52Z|00177|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:29:52 np0005601977 nova_compute[183130]: 2026-01-30 09:29:52.445 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:52 np0005601977 nova_compute[183130]: 2026-01-30 09:29:52.681 183134 INFO nova.network.neutron [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Port b82506d0-07bd-485e-9c87-976a162f45f9 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 30 04:29:52 np0005601977 nova_compute[183130]: 2026-01-30 09:29:52.682 183134 DEBUG nova.network.neutron [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:52 np0005601977 nova_compute[183130]: 2026-01-30 09:29:52.760 183134 DEBUG oslo_concurrency.lockutils [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:52 np0005601977 nova_compute[183130]: 2026-01-30 09:29:52.817 183134 DEBUG oslo_concurrency.lockutils [None req-4578c8b4-0fa6-4fbf-9741-73516023579e a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-65e07f9f-264b-4e0d-9aa7-f87ebaf84705-b82506d0-07bd-485e-9c87-976a162f45f9" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.371 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.371 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.371 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.372 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.372 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.373 183134 INFO nova.compute.manager [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Terminating instance#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.373 183134 DEBUG nova.compute.manager [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:29:53 np0005601977 kernel: tap2011cfc4-30 (unregistering): left promiscuous mode
Jan 30 04:29:53 np0005601977 NetworkManager[55565]: <info>  [1769765393.3982] device (tap2011cfc4-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.403 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:53Z|00178|binding|INFO|Releasing lport 2011cfc4-3053-450f-9a91-99928686bc26 from this chassis (sb_readonly=0)
Jan 30 04:29:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:53Z|00179|binding|INFO|Setting lport 2011cfc4-3053-450f-9a91-99928686bc26 down in Southbound
Jan 30 04:29:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:29:53Z|00180|binding|INFO|Removing iface tap2011cfc4-30 ovn-installed in OVS
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.407 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.410 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 30 04:29:53 np0005601977 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000013.scope: Consumed 12.781s CPU time.
Jan 30 04:29:53 np0005601977 systemd-machined[154431]: Machine qemu-11-instance-00000013 terminated.
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.453 183134 DEBUG nova.compute.manager [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-changed-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.453 183134 DEBUG nova.compute.manager [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing instance network info cache due to event network-changed-2011cfc4-3053-450f-9a91-99928686bc26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.453 183134 DEBUG oslo_concurrency.lockutils [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.453 183134 DEBUG oslo_concurrency.lockutils [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.454 183134 DEBUG nova.network.neutron [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Refreshing network info cache for port 2011cfc4-3053-450f-9a91-99928686bc26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.616 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:c6:12 10.100.0.11'], port_security=['fa:16:3e:57:c6:12 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '65e07f9f-264b-4e0d-9aa7-f87ebaf84705', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34535701-9131-4137-9b04-abc5c4bde788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7d566d0-1254-4641-872b-bbe4cfbb0f9f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbacebc3-4b49-42c7-a1e2-a9ffa8ce4adb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=2011cfc4-3053-450f-9a91-99928686bc26) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.617 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 2011cfc4-3053-450f-9a91-99928686bc26 in datapath 34535701-9131-4137-9b04-abc5c4bde788 unbound from our chassis#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.619 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34535701-9131-4137-9b04-abc5c4bde788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.620 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f05e1a76-b3a0-4cf0-9a4e-3d4c24df6e27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.621 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-34535701-9131-4137-9b04-abc5c4bde788 namespace which is not needed anymore#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.634 183134 INFO nova.virt.libvirt.driver [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Instance destroyed successfully.#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.635 183134 DEBUG nova.objects.instance [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:53 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [NOTICE]   (215771) : haproxy version is 2.8.14-c23fe91
Jan 30 04:29:53 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [NOTICE]   (215771) : path to executable is /usr/sbin/haproxy
Jan 30 04:29:53 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [WARNING]  (215771) : Exiting Master process...
Jan 30 04:29:53 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [WARNING]  (215771) : Exiting Master process...
Jan 30 04:29:53 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [ALERT]    (215771) : Current worker (215773) exited with code 143 (Terminated)
Jan 30 04:29:53 np0005601977 neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788[215767]: [WARNING]  (215771) : All workers exited. Exiting... (0)
Jan 30 04:29:53 np0005601977 systemd[1]: libpod-c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5.scope: Deactivated successfully.
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.725 183134 DEBUG nova.virt.libvirt.vif [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:28:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1268443133',display_name='tempest-TestNetworkBasicOps-server-1268443133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1268443133',id=19,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHLzlEd8wtlKHzkefK5E1BRlX6005o4Q+VYVdgk6Bo5nNG7FbaDriF0L4Ht8F7Rjf7lK+BlHNvbTNd1Pnjv98mpDpVPg4jQY5y/vCWLyhmCdni5A62CKqBzV5cIxNTWJvg==',key_name='tempest-TestNetworkBasicOps-552530718',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-im3rxvyf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:09Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65e07f9f-264b-4e0d-9aa7-f87ebaf84705,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.725 183134 DEBUG nova.network.os_vif_util [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.726 183134 DEBUG nova.network.os_vif_util [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.726 183134 DEBUG os_vif [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:29:53 np0005601977 podman[216712]: 2026-01-30 09:29:53.727363133 +0000 UTC m=+0.038940673 container died c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.727 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.727 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2011cfc4-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.730 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.732 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.734 183134 INFO os_vif [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:c6:12,bridge_name='br-int',has_traffic_filtering=True,id=2011cfc4-3053-450f-9a91-99928686bc26,network=Network(34535701-9131-4137-9b04-abc5c4bde788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2011cfc4-30')#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.735 183134 INFO nova.virt.libvirt.driver [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Deleting instance files /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705_del#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.735 183134 INFO nova.virt.libvirt.driver [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Deletion of /var/lib/nova/instances/65e07f9f-264b-4e0d-9aa7-f87ebaf84705_del complete#033[00m
Jan 30 04:29:53 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5-userdata-shm.mount: Deactivated successfully.
Jan 30 04:29:53 np0005601977 systemd[1]: var-lib-containers-storage-overlay-a8fd4da329e6c7463504556dda0aa5cf88dab6757f711f19e721760d8db4d57b-merged.mount: Deactivated successfully.
Jan 30 04:29:53 np0005601977 podman[216712]: 2026-01-30 09:29:53.754507369 +0000 UTC m=+0.066084899 container cleanup c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:29:53 np0005601977 systemd[1]: libpod-conmon-c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5.scope: Deactivated successfully.
Jan 30 04:29:53 np0005601977 podman[216740]: 2026-01-30 09:29:53.800627787 +0000 UTC m=+0.032974154 container remove c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.803 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f67d7f91-6f6f-4533-ad99-21676c67085a]: (4, ('Fri Jan 30 09:29:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788 (c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5)\nc7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5\nFri Jan 30 09:29:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-34535701-9131-4137-9b04-abc5c4bde788 (c7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5)\nc7473c8f317595bf1601a8cb90ef91fbf3956ac0fae4dc7ead94d3e235102ef5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.804 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[983324f0-052b-4360-abef-7ebcd4d8088c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.805 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34535701-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.807 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 kernel: tap34535701-90: left promiscuous mode
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.814 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.818 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[22c65f3d-b60b-48ce-a47a-aec265568bb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.835 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2ecd8998-0f66-4f71-9845-a5d663cda7f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.837 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8548fdd2-aa86-4329-bf7a-5859df194404]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.849 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bb418c86-3644-4f99-a609-f29f656afd76]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387800, 'reachable_time': 18839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216757, 'error': None, 'target': 'ovnmeta-34535701-9131-4137-9b04-abc5c4bde788', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.850 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-34535701-9131-4137-9b04-abc5c4bde788 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:29:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:53.850 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[581d831c-be3f-41cf-b777-89222538342d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:29:53 np0005601977 systemd[1]: run-netns-ovnmeta\x2d34535701\x2d9131\x2d4137\x2d9b04\x2dabc5c4bde788.mount: Deactivated successfully.
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.908 183134 INFO nova.compute.manager [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.908 183134 DEBUG oslo.service.loopingcall [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.909 183134 DEBUG nova.compute.manager [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:29:53 np0005601977 nova_compute[183130]: 2026-01-30 09:29:53.909 183134 DEBUG nova.network.neutron [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:29:54 np0005601977 podman[216758]: 2026-01-30 09:29:54.859193711 +0000 UTC m=+0.079424320 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 30 04:29:55 np0005601977 nova_compute[183130]: 2026-01-30 09:29:55.267 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.449 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7a073e24-c800-4962-af5e-ff5400800f34', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'hostId': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.451 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'hostId': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '396e2944b44f42e59b102db87e2e060c', 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'hostId': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.454 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '362db3ae-3984-411e-994b-55924dc0c06f', 'name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000015', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'hostId': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.458 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.461 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 22bc0323-ee7d-4b6e-992e-a2410bf240e6 / tape680749e-01 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.461 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.463 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 93629e5c-ca92-47ac-8567-35d85b4e2a73 / tap695209cb-0d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.463 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.465 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 362db3ae-3984-411e-994b-55924dc0c06f / tap4433db17-a6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.465 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18bbecad-1d0e-4476-912a-9585b42dce90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.455542', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3cd3664e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': 'fd802ce57e7cb2d1c20512b01d00afc6fa9cc531b74dff7a9051bfbbc6a06cbb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.455542', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3cd3d50c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': 'c36b05234a4110645ec63d0c460021174070b818fc6b41e7fecd620d8f0259a3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.455542', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 
'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3cd4266a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': '94ff59df38f1ed715e917f75482bf4af4e4fc68523e58b841cee08e65f92cd03'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.455542', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3cd47322-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': '567572807865b30f48931daa9e392cf90f73d913c6329b7fb12686b7da5817b9'}]}, 'timestamp': '2026-01-30 09:29:55.465628', '_unique_id': 'f52c70ca4b5d476ba1ac91148cd36bcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.467 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.468 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.outgoing.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.468 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.packets volume: 45 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.468 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11a31bf8-0925-48e2-ae74-95addcbaab06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.467796', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3cd4d24a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': '203fc7c6f66f0649f140544c4fda6f62444a2e6ddc108f23dacb617d38936d3b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
21, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.467796', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3cd4dbc8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': '8b87c0c43d9cf03fd505c80dfbb25d946318d5c2928f128987968400f5d6fdd3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 45, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.467796', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 
'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3cd4e7c6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': '26719b87e85db2b1264a42c9e780bf045997af612c23d883e9d5628d8d5816c6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.467796', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3cd4efaa-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'f29410fa4a06fedb35a1cd8263d4be76eedf358bf59dd4a79a76c1287e5318ce'}]}, 'timestamp': '2026-01-30 09:29:55.468790', '_unique_id': '1d19d49c418e454ea2fd20010232b7b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.470 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.470 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.470 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb1da2a5-21e7-478b-8414-fb270e98f71b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 20, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.470142', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3cd530fa-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': '6ac8855feb483f28accac90b3069d8a813a52f2227b60f52c54ace5db307e3fe'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.470142', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3cd53bd6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': 'a88b8380170163195ff624b704e599ecdca62397de2441ced389311407e1b43a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.470142', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 
'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3cd54626-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': 'bcd111a007be19cabb57a6c435d32a1a8864342c823c77b378c5b3159ed1d743'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.470142', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3cd553fa-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': '01b64e6acee47163c42f0d693f69d1099368052e97a916f9b8b82cb361ca40d0'}]}, 'timestamp': '2026-01-30 09:29:55.471438', '_unique_id': '888cc865fc0d4e8b9654086513f3301e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.495 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.495 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.533 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.write.latency volume: 1087370 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.534 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.566 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.latency volume: 2076628431 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.566 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.596 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.write.latency volume: 2969574095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.597 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8d8c714-68ea-41c1-bd7d-14e0f5f8c641', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cd909dc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '95b018734446704367ee120ab4c66713d4a7b3fc6933d3a04dee9787fc1568ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': 
None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cd91c24-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '86287b1d43224ed947fae86f288f6dcc982f9bc5452b1c74f6c39ffccedd2a0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1087370, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cdef37e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '4b3468ae17257b3d4b09e0dc3d423cbb52b612a7c7819996508f574097c8aafd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cdf0044-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': 'de74e82e74c1a4bdc9b81685f5fb6781853b9aedfdf9e347f189362b6762bb0f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2076628431, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ce3dcae-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '0650b8cc66827677f78a72791cc8334c27117ccc0757f70fa6fcff7d3a91f124'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: age_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3ce3e686-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '77f99fceda3a564810cdcc93533a350872455617eaabcd471e33f7af6b55891c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2969574095, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ce88d9e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': 'ea4242e65f5553d40d9655d040a65ba44931e2dad5714129b6c394ff67e38a06'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': 
'3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.472653', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3ce89a5a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '71f230f61a34b72b73b681fd4a15ce876f1c10689fb0edb6fdea799b3360c933'}]}, 'timestamp': '2026-01-30 09:29:55.597728', '_unique_id': 'e7d105f8cb0a4bf185b2bb7cfcf60d3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.599 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.bytes.delta volume: 378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.600 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.600 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.600 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05696de0-4c08-4824-88ca-a1c44dffcb48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 378, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.599847', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3ce8f9a0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': 'a590676f6bd589117070e050ef160fbf4141e28258a8287b8fb5a1700868c472'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.599847', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3ce905d0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': '039aae71d38196f5127a08d11ba0c5b59875d4470cf43c460e4d7e9ab7583fc8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.599847', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': 
'02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3ce910b6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': '27703df27d9fc2f9da3e4de65a80b70eda78fd54e5177190e8a2d8e0b5fb466b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.599847', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3ce91b56-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'cff73af3cc13ba327da387d97abd87881df33bb1bffc406758bd9f394db894ca'}]}, 'timestamp': '2026-01-30 09:29:55.601023', '_unique_id': 'faf783cc09ee4a439e536806a88cc9ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.601 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.602 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.602 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.602 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>]
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.602 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.603 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.603 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.603 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.603 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.604 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.bytes volume: 30276096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.604 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.604 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.read.bytes volume: 30640640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.604 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c89cf675-013a-4ae0-891e-8bd90cb576dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ce97740-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '113e99476d25433af93c163e1d1fc15abe67c8f6e947e25574982639171a846a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 
'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3ce98226-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '88cf3e90c825abf8474a28696f8f2092c97c1b35ad1c01b124a9cbb5b31d1551'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ce98c4e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '034bc2920f5db56bc434c416dcb616407642243d31a03d058eb89fb8fcb6b869'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3ce9961c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': 'cf7da984e730f2a6152b79d4a816355f40491789b5b4bd576f17fdf98ced8788'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30276096, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': 
None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ce9a12a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': 'e8431643d378346903e6bb0c45f8ba6b2687508ea1d3d87590b49d8992155b8a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 
'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d2
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: ecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3ce9aaf8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '3e74b9fc8292b43f25419226f156e8bd1d3890cabfd00a29edd15fbc3fe0509b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30640640, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ce9b4b2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '2afe49016349a675c91f2cfe38ceb7e0fc8aac688248bfa4e7aff8f73bc76428'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 
'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.603045', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3ce9be4e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': 'f00ef62f66f1ac705f96a1f3c9dba1587c1db859a3a7361395f090e6c2ad8df9'}]}, 'timestamp': '2026-01-30 09:29:55.605197', '_unique_id': 'a9d3f31878c94ef3bfaa0854dc5d5c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.606 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.606 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.607 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.607 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.607 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.607 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.latency volume: 728377080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.608 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.latency volume: 53741683 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.608 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.read.latency volume: 455680865 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.608 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.read.latency volume: 43745692 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '829df8cf-95ac-4f9d-888f-6379ea1fe7c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cea0afc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '9d9485764bb3e22754b89a1ebc7feb42f6648560543141bfb51dd14cb6fa1034'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 
'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cea17d6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '881f7885685fe5628c041530848573b3d8f557b49273ec604cbace1f2d0c35ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cea21f4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '4928d79f50e4fd6cabb34bfbaa65c29db7b0e59c46dc967fe8861eb4ce380d32'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cea2bb8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '816079b06a7333c26ca999e2c2f8ad38247e4ff5bcac31120cb3630e917a3076'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 728377080, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 
'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cea357c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '7b6e9ab0872b8780707c1f80e39c8573fec6ac2a4e70bc148c0c97922a3c51f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53741683, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf6
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: f_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cea4044-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '6dfbe196702bbe3313dfdf781ca8e143d9fa94cdea32fd33a8b7c3611249b7fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 455680865, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cea4a1c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '1ae54e0166eba9097a8a031a454b6cdfa8dcd657d08334a1ed67b0078f3d22a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43745692, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': 
'3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.606857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cea53cc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '30bc4a460ea13cb3e481c0c716c103e9c69736005a86d53cd01ea7f6068d4fd7'}]}, 'timestamp': '2026-01-30 09:29:55.609007', '_unique_id': '601a331722984f8fa2b2b87a8b87df90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.610 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.623 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.624 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.635 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.636 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.645 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.645 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.658 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.659 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9779e5ff-53e3-4eb1-a8cf-31127ca0be64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ceca4b0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.005511371, 'message_signature': '5c40cc43499abf31ecd7738766a1e51ea93563b2d829793e687ccd33e63a299c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 
'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cecb37e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.005511371, 'message_signature': 'c58a9350e07faf7afa5630acace34316b81a8c8c03eff56378331b3a33be692e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cee6ea8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.0194675, 'message_signature': '099b42e758591c33e75f9c80dc20de55ab6f29536e7952d82f6c70c0921d63a3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cee8474-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.0194675, 'message_signature': 'fe39e24412495188d46e01d8f2896be75e3436663a273c1c4df056b6274115d0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 
'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3ceff2d2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.031442272, 'message_signature': 'eb4666a056cfc03e1f4d17d1b3a767d39b0c37831f61088c6391d9fcd6510492'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 
1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'im
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cf006be-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.031442272, 'message_signature': '98c7d4579a88434d60c42112a4f87cfb525323c8d57055d5b75b8d837370800e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cf1f23a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.041335025, 'message_signature': 'b8db0da621ca1da4d46bf855df3b8ef598335089746b6c889065ba477e4dad61'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 
'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.610606', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cf20c0c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.041335025, 'message_signature': 'a47dfdeae4ea18513f3f7ef4a0fa8b9fb1f62ee8c07e6a7f761c77e5ca6d6f9d'}]}, 'timestamp': '2026-01-30 09:29:55.659724', '_unique_id': '600400f84e94467dabfa984170d04d26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.663 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.664 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>]
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.664 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.664 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.665 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.665 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.666 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0469c738-6bd0-4e80-bd12-0e5a8c2c2710', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.664512', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3cf2dc86-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': '2add251891362933d509626e383db4d10905a07cf6c00b257ab00d7271398472'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.664512', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3cf2f234-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': 'd45f829c28e7ef14f051ddd0307fa15fc91001b1fb07298905549687d49b29a2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.664512', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 
'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3cf303e6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': 'a20339590823003d30b1cd8dd234ba80259cce3c3639af5c1f6b82bfd699bffb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.664512', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3cf31764-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': '13183baacbee8aab4b6e5892b185d7671d03d5c88d583d967abd097a07ff4258'}]}, 'timestamp': '2026-01-30 09:29:55.666539', '_unique_id': 'e6e43b362d064fdfafaf4b0b99e0c1df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.667 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.668 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.682 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/memory.usage volume: 42.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.700 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/memory.usage volume: 42.84765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.718 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/memory.usage volume: 46.7734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.738 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/memory.usage volume: 40.44921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b25fc87f-c933-4e39-a35f-5f6b69deca5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4140625, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'timestamp': '2026-01-30T09:29:55.669049', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3cf596ce-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.077145978, 'message_signature': 'bb46ed6460f07a76a569c838977f141fad4a38a6d24881bc22f6625015cc6176'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.84765625, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 
'22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'timestamp': '2026-01-30T09:29:55.669049', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3cf867f0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.095606975, 'message_signature': '4262c42fb70da96505b34034695ed645eb0abfcbdcb1f4e43284470a0a285cfd'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.7734375, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'timestamp': '2026-01-30T09:29:55.669049', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3cfb119e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.113127186, 'message_signature': '93b3f1441232c32ca202d91120820766d7767e632bfa9ae7f80baa1184ffe2af'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44921875, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'timestamp': '2026-01-30T09:29:55.669049', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3cfe2ce4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.133465397, 'message_signature': '26376eeeecf8846fdcec761a32158e7edf8ae7fcf03eaed885a20f4d1979c8dc'}]}, 'timestamp': '2026-01-30 09:29:55.739473', '_unique_id': 'dd6f3fcc54804afbb832e1cea2382a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.741 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.742 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.743 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.743 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.744 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.744 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.745 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.745 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.746 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.746 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2e50ef4-639b-4f2f-a50b-58e1d7b91e31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cfed8c4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.005511371, 'message_signature': 'b5026cf941c95828eec214c7e243a53556a128ff9c42df9f58eea83a24561e12'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 
'7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cfeefe4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.005511371, 'message_signature': '725c3f6372280b35862da00b9296045b9f246547fbb2282868544799ee60d42e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cff0326-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.0194675, 'message_signature': 'b5fa6aea1d1f5613e58f4a6328235539bca8f4713c855ebf6d4733748a2f1fd6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cff1988-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.0194675, 'message_signature': '1fac6cd7c7c18c77d4d0ba9ea3b37c5dda7f7e834d12ba1d8b8df6c1038dba14'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': 
'396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cff2d42-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.031442272, 'message_signature': '33f340d655dc195ebc65d530d0ec2ed598650b98c8f58499453d8372cabaee98'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 
'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: pus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cff4340-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.031442272, 'message_signature': 'f9097eda8982b3f5af2fce26a741b24308bd7186f5dae47d9868d2abc567b69d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cff5614-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.041335025, 'message_signature': '0cad6d15e4cb910faf335d2107c572087ac6c46d65d0e63b9edea3a942a40c34'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 
'362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.742990', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3cff6c8a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.041335025, 'message_signature': 'c3ed7714d2bdf7df6e75baee2d0588588c91db8e85a6eff640b3f459bc57710d'}]}, 'timestamp': '2026-01-30 09:29:55.747392', '_unique_id': '0e23f5cb727d46eba253e8b017234425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.750 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.751 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.751 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.write.requests volume: 2 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.752 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.752 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.753 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.753 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.754 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17f078c7-864d-48bc-9c81-224233ccbfb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3cfffe48-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '73c231a5ed728c3762c7ecd193fb93165c9bcb4e4cea3af3e7f7b2f07b004017'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 
'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d001856-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '48cd8121d7fa72fe4200a41bf0657551873a8f573208228b194379cb9a9a7cef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 2, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d002896-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': 'c7c74c1d605c7211d874b21c4643483eac2361a664fc41a16d361ae5439cffb1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d004056-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '0dd650d8cd9c6dd233833ee4a6ba3c05f3beb2779a00dd9047f942373a7a8302'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d005064-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '9e934116ed821e87297e7287cde43a6bafa04cf39039a16ba9161ff1cda387e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state'
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d00691e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '368a1778d68e8b0fddc3c546fb68847d1b995b3b403f597acbd63a3fef237c29'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d007936-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '4029bdbe091e9f8788300dc05d06879200f5575f7f4503454387285b93f6083c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': 
None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.750618', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d008f98-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '0b2c1403d220d745d47c7133387282d9c23a4cc2ee6f522835e2124f783901f7'}]}, 'timestamp': '2026-01-30 09:29:55.754798', '_unique_id': '82e742e7aaa14ef4bac9073c9633654b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.757 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.758 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/cpu volume: 1240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.758 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/cpu volume: 100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.759 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/cpu volume: 11300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.759 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/cpu volume: 10570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1f6201f-71ba-4f00-b548-699fca05fcbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1240000000, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'timestamp': '2026-01-30T09:29:55.758015', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3d011fb2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.077145978, 'message_signature': '0b0944c29023d965be9db7729df6b73c9091dbe17318a7e75481e82116fdc08b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 100000000, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 
'22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'timestamp': '2026-01-30T09:29:55.758015', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3d01345c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.095606975, 'message_signature': '4ddccf6accf6a43099e0de364f592e7469611fbb36eee1ed63d756540c285556'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11300000000, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'timestamp': '2026-01-30T09:29:55.758015', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3d014640-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.113127186, 'message_signature': '20726cfe46b43e9fd80408254ca594945336e6bcf6ad7fa61d4254b1411f465a'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10570000000, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'timestamp': '2026-01-30T09:29:55.758015', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3d01591e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.133465397, 'message_signature': 'e146c3fe78ad79a811e723f582abdca38f1379d5fafedc80db83958431b8440b'}]}, 'timestamp': '2026-01-30 09:29:55.760010', '_unique_id': '389ee26143074f30a73139da083a8cc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.761 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.763 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.763 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>]
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.763 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.764 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.764 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.765 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.765 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.766 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.requests volume: 1089 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.766 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.767 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.read.requests volume: 1096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.767 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10381fc3-175a-4976-b7ce-7500939da55d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d020b52-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': 'fb819a66fdee5b9aea2381c8d19ebeebf93f8d24809387397384a900641d26df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 
'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d021c82-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '2cfa3fd258f6f7d5b963be383bc972b0109e3e32d6bba2e3c6a3b29f007dd1dc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d0233b6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '248f04141b1c61178074eaa5ef2da3333fd32ce085060fc27132cc7911906216'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d0245ae-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '7d5887e0fc9a8b7f4b595b5ee9b5b82438b7facdea59a41f0eb64e38656191f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1089, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d025c10-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': 'ad45735a9ae30f0fa85eb1dc6ea3da60d0f90b48e964f272da6934bfdf3f0b4c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: d-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d026be2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '8dd9988098604cc388e3b96f31f76f7dfe4aeb3696397eee3ce69ebcb160b83b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1096, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d0283c0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '7969f7cc95706bf6b9d78e19be2168e25b00d4de7ba5b4126afcf9d5301c291a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': 
None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.764016', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d029356-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '4221a10ece6dbc762ece6848156a79d699495104930b1f38cef8fcd66624b769'}]}, 'timestamp': '2026-01-30 09:29:55.768104', '_unique_id': '34186bb8728c435685b10da032f92c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.771 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.771 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.772 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.772 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82174cfc-a27c-4e40-ae7c-930c3c1bd982', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.770999', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3d032366-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': 'd16c11267f228d60ef84499afd5c524ed4def8888ab91b860a03027256c3bd68'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.770999', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3d0335e0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': '2b35151b9b8d338e7761bf37b6e644d02f50c3462b2b2bbfb60e9912b3f977be'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.770999', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 
'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3d034422-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': 'bf50fce7b173025bcf3b9ff23a5badf7ba53afd79eba3d0812721dd26074fcce'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.770999', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3d034ecc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'e032188c5bdf0d00be4251514ee2a6447ed7062f2e6886d5aed2eb9425b1dda3'}]}, 'timestamp': '2026-01-30 09:29:55.772762', '_unique_id': 'df29aab7dd774561870fdef45143097f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.773 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.774 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.incoming.bytes volume: 1168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.775 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.incoming.bytes volume: 710 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.775 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.bytes volume: 7284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.776 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.incoming.bytes volume: 1842 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14321182-21c0-4d99-b6f4-b942c17493b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1168, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.774807', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3d03ad36-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': '8d054331a938c36d7b125aaf9c85ae591791d528ad1475c83c03e5258dbc8c62'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 710, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.774807', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3d03c050-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': 'c029a8d9a48dc721e6e486c1b133781af3c88390f83614324b8f295ce7eb1372'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7284, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.774807', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 
'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3d03ce60-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': '93901363409708d03716e5b3b2fd0af13e988e04c971b3167de3cad102501579'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1842, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.774807', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3d03ddd8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'affc30a6463228eaabe0922a6f7cc056dedbb27043a97c7726239d957d40e69f'}]}, 'timestamp': '2026-01-30 09:29:55.776523', '_unique_id': 'f9c2c97180e94e65b4005f12bc1ed118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.777 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.778 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.779 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.779 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.write.bytes volume: 12288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.780 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.780 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.781 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.781 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.write.bytes volume: 72908800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.781 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9936c6f-b5c1-415a-8617-114f0ab9d2e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d04525e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': '529ed5702118e98c4776996a1549d2648fb89f70605e2e53b51207768a13047e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 
'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d0463fc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.867537659, 'message_signature': 'd20257b52c625a827b16866d25c2e3005abd670f2ba43a61ef64f613bc0754e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12288, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d04723e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '783098011e1acca963c799ab500a7ed26c17396c3806f9800ea437a36f178af6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d0483aa-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.891266207, 'message_signature': '0ee1eaa54f732baab5e258c17a80e00c7f74967a5d93fc081581c3aeab5be673'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 
'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d049250-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': '0f543bcfa9a8a6b7dc92f22eb31f74e25397b0b18f951a975aa05e28c5aba79f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 
'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83f
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: rchitecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d04a182-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.929722906, 'message_signature': 'd78c3becd2d990b5ee414c5f90dec49d31073ec5978c1d75c224b84fb106eb7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72908800, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d04b280-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': 'afa1e036e62894ae23310323b9fd37f864d58ce2b793f2106133932cbf24fe34'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 
'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.778944', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d04c09a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.961737531, 'message_signature': '7381c5380f61aa2c30c45f7946a04ca8ef1b81e5cb91a28b820bda701f9e710e'}]}, 'timestamp': '2026-01-30 09:29:55.782278', '_unique_id': '47e69c791094466994f90589ff0cc51d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.598 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.784 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.allocation volume: 30617600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.784 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.785 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.785 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.785 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.786 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.786 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.allocation volume: 30875648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.786 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11d82ebe-b92b-4d36-877f-0d3e9d9dc362', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30617600, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-vda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d0520b2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.005511371, 'message_signature': 'e7e18a0424723f0ea3fdbf3ef4295a5c136a550b9aad89dcb6058f3b4f703273'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 
'resource_id': '7a073e24-c800-4962-af5e-ff5400800f34-sda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'instance-00000006', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d05341c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.005511371, 'message_signature': 'bcc59ca45b5cd7684f12d3dc66778acfbb425a43f3f368a82febf4561e21eb78'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-vda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d053f52-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.0194675, 'message_signature': '21987a2cb4cf131a964add9198f388ed098f6243d89e06474d3c7e704ce86b11'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6-sda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'instance-00000010', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d054bf0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.0194675, 'message_signature': 'f9975fbb687e6206a63ef30cf77b0ff6c139618faae6eb9cef18c9cad2f8c307'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': 
None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d055988-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.031442272, 'message_signature': 'f4d1cae934671c3b5e1ea72ff05c885e7ef3fa39139a561dcca3260786d0cad1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d056590-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.031442272, 'message_signature': '74389bf5d4a09ca3fc5cb98f6cff92bca8c08ad372a5df883c434026968401b3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30875648, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-vda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3d0570a8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.041335025, 'message_signature': 'f07d883678c961ac645ac6b43f19343f41150e1025810d152d1139672f7b97bd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 
'resource_id': '362db3ae-3984-411e-994b-55924dc0c06f-sda', 'timestamp': '2026-01-30T09:29:55.784369', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'instance-00000015', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3d057e04-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3927.041335025, 'message_signature': '385910ab892e2751250af6f283f780836285766a6b4fbe663c77d7b57631007c'}]}, 'timestamp': '2026-01-30 09:29:55.787077', '_unique_id': '83ac60bec92a4b21bc24770f6b2fe581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.788 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.789 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.789 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.789 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcc30705-b3e6-4cee-a73e-5d30e856ca96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.788734', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3d05d08e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': 'c3a2d9da9d1ef6ef0fa77cecf12de2c4fe850516ada6cddf8411150ecf13ac9d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.788734', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3d05dd22-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': '3484a2d7092b7f866b46d88c79632a3dfe4d9d5ed8b4ee54a559552d9c33af08'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.788734', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': 
'02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3d05e98e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': 'e71f01fb74dd2f5eff866e6aaab26ac47744857abc21d0670627e771fdd2e308'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.788734', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3d05f7bc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'e3a850b4742b195fc9e8e5d32bb90098a8c73e2f3ec2bdc60c324d187ded0ce2'}]}, 'timestamp': '2026-01-30 09:29:55.790200', '_unique_id': '723103b78b4346a592944022a1ffa26d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.790 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.791 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.792 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.outgoing.bytes volume: 1460 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.792 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.bytes volume: 5882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.792 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4beff830-217c-4a20-ba84-3307a0a23ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.791806', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3d064910-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': 'f98708e5a4aed297e2a1a4e0a1413c03d565d26818b6d3e28da036d8aff7349e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1460, 
'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.791806', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3d065658-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': 'e113759ebda8dfbc5666653006168950e1f9689f2cbcaefda168b4a3ba4af3d9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5882, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.791806', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 
'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3d0662c4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': '8c84691ecd84a220a26f1dfbd37c72ea392f991f336782fb3fd2f01ac79e1e98'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.791806', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3d0671ba-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'd9ec3f119bd4e49d3aa98a79b551e330ec2ac420b7665e994f552763de01884c'}]}, 'timestamp': '2026-01-30 09:29:55.793318', '_unique_id': 'a3b24a02c9a844b4af9172e47408b755'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.794 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.795 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-481398456>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-149082075>]
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.795 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.795 12 DEBUG ceilometer.compute.pollsters [-] 7a073e24-c800-4962-af5e-ff5400800f34/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.795 12 DEBUG ceilometer.compute.pollsters [-] 22bc0323-ee7d-4b6e-992e-a2410bf240e6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.796 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.796 12 DEBUG ceilometer.compute.pollsters [-] 362db3ae-3984-411e-994b-55924dc0c06f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d3552d9-c4fa-4753-b619-4939c77ee144', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000006-7a073e24-c800-4962-af5e-ff5400800f34-tapfb902761-f0', 'timestamp': '2026-01-30T09:29:55.795552', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1403442336', 'name': 'tapfb902761-f0', 'instance_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9b:52:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb902761-f0'}, 'message_id': '3d06d7c2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.8504156, 'message_signature': '17a67798d8e0bdbb55377f4b0ca5a5b116c93364bbe1d325e075680b82d74f66'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3fd4ee63e94e4c3b9a3e4cefa7e0f626', 'user_name': None, 'project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'project_name': None, 'resource_id': 'instance-00000010-22bc0323-ee7d-4b6e-992e-a2410bf240e6-tape680749e-01', 'timestamp': '2026-01-30T09:29:55.795552', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-481398456', 'name': 'tape680749e-01', 'instance_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'instance_type': 'm1.nano', 'host': '7e6181ddafc0ea2f49a356c984b911950c8c246fa83e2d5c5d8a444e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:37:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape680749e-01'}, 'message_id': '3d06e6cc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.853697294, 'message_signature': '97d56bb01894d655648607c8196d77e3a85cbec5459e53c6887be43a379b326e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:29:55.795552', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 
'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '3d06f536-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.856412922, 'message_signature': '152f5447aab9a083aca5ac787a012e4b3ce307b47799287d064c6fa5e7c67483'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000015-362db3ae-3984-411e-994b-55924dc0c06f-tap4433db17-a6', 'timestamp': '2026-01-30T09:29:55.795552', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-149082075', 'name': 'tap4433db17-a6', 'instance_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'instance_type': 'm1.nano', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:24:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4433db17-a6'}, 'message_id': '3d0700e4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 3926.858492991, 'message_signature': 'ba7b69fd8b557fe1b4a7cf3d0c06be1d125dba897b964312242c1119bef91836'}]}, 'timestamp': '2026-01-30 09:29:55.796971', '_unique_id': 'e12f4f75be8b4a7992ca78dcd3a879ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.605 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:29:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:29:55.797 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.609 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.661 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.748 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.756 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.769 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.783 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:29:55.787 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.416 183134 DEBUG nova.compute.manager [req-d6502b9a-d2fc-4b79-8a8d-069c64ef6b37 req-6468e5e1-7e42-4318-92ac-be264943b919 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-unplugged-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.417 183134 DEBUG oslo_concurrency.lockutils [req-d6502b9a-d2fc-4b79-8a8d-069c64ef6b37 req-6468e5e1-7e42-4318-92ac-be264943b919 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.417 183134 DEBUG oslo_concurrency.lockutils [req-d6502b9a-d2fc-4b79-8a8d-069c64ef6b37 req-6468e5e1-7e42-4318-92ac-be264943b919 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.417 183134 DEBUG oslo_concurrency.lockutils [req-d6502b9a-d2fc-4b79-8a8d-069c64ef6b37 req-6468e5e1-7e42-4318-92ac-be264943b919 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.418 183134 DEBUG nova.compute.manager [req-d6502b9a-d2fc-4b79-8a8d-069c64ef6b37 req-6468e5e1-7e42-4318-92ac-be264943b919 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-unplugged-2011cfc4-3053-450f-9a91-99928686bc26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.418 183134 DEBUG nova.compute.manager [req-d6502b9a-d2fc-4b79-8a8d-069c64ef6b37 req-6468e5e1-7e42-4318-92ac-be264943b919 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-unplugged-2011cfc4-3053-450f-9a91-99928686bc26 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.440 183134 DEBUG nova.network.neutron [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.462 183134 INFO nova.compute.manager [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Took 2.55 seconds to deallocate network for instance.#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.519 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.519 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.540 183134 INFO nova.compute.manager [None req-5cacd825-852c-4aa9-994f-e5eedf83708c 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Get console output#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.548 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.634 183134 DEBUG nova.network.neutron [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updated VIF entry in instance network info cache for port 2011cfc4-3053-450f-9a91-99928686bc26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.634 183134 DEBUG nova.network.neutron [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Updating instance_info_cache with network_info: [{"id": "2011cfc4-3053-450f-9a91-99928686bc26", "address": "fa:16:3e:57:c6:12", "network": {"id": "34535701-9131-4137-9b04-abc5c4bde788", "bridge": "br-int", "label": "tempest-network-smoke--1537365840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2011cfc4-30", "ovs_interfaceid": "2011cfc4-3053-450f-9a91-99928686bc26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.667 183134 DEBUG oslo_concurrency.lockutils [req-31a04263-8195-415a-9487-945a0d22d984 req-4a21c051-4b37-4067-8e66-711f65e6e5d9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-65e07f9f-264b-4e0d-9aa7-f87ebaf84705" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.721 183134 DEBUG nova.compute.provider_tree [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.741 183134 DEBUG nova.scheduler.client.report [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.768 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.813 183134 INFO nova.scheduler.client.report [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705#033[00m
Jan 30 04:29:56 np0005601977 nova_compute[183130]: 2026-01-30 09:29:56.873 183134 DEBUG oslo_concurrency.lockutils [None req-efd77114-2e23-4ac1-abfd-0220c2dfda3a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.056 183134 INFO nova.compute.manager [None req-e476cd8c-3f70-4726-aba1-de187a7adf4b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Pausing#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.057 183134 DEBUG nova.objects.instance [None req-e476cd8c-3f70-4726-aba1-de187a7adf4b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'flavor' on Instance uuid 362db3ae-3984-411e-994b-55924dc0c06f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.087 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765397.0869, 362db3ae-3984-411e-994b-55924dc0c06f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.087 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.089 183134 DEBUG nova.compute.manager [None req-e476cd8c-3f70-4726-aba1-de187a7adf4b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.116 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.120 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:29:57 np0005601977 nova_compute[183130]: 2026-01-30 09:29:57.148 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 30 04:29:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:57.382 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:57.382 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:29:57.383 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.731 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.868 183134 DEBUG nova.compute.manager [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.869 183134 DEBUG oslo_concurrency.lockutils [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.869 183134 DEBUG oslo_concurrency.lockutils [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.869 183134 DEBUG oslo_concurrency.lockutils [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65e07f9f-264b-4e0d-9aa7-f87ebaf84705-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.870 183134 DEBUG nova.compute.manager [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] No waiting events found dispatching network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.870 183134 WARNING nova.compute.manager [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received unexpected event network-vif-plugged-2011cfc4-3053-450f-9a91-99928686bc26 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.871 183134 DEBUG nova.compute.manager [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Received event network-vif-deleted-2011cfc4-3053-450f-9a91-99928686bc26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.871 183134 INFO nova.compute.manager [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Neutron deleted interface 2011cfc4-3053-450f-9a91-99928686bc26; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.871 183134 DEBUG nova.network.neutron [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 30 04:29:58 np0005601977 nova_compute[183130]: 2026-01-30 09:29:58.874 183134 DEBUG nova.compute.manager [req-478ec084-5f16-4a3f-beab-8bef526ad5cb req-3964c6bc-eef6-4030-9218-b0a52bb4ab70 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Detach interface failed, port_id=2011cfc4-3053-450f-9a91-99928686bc26, reason: Instance 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.362 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.362 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.363 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.363 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.465 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.517 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.518 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.592 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.601 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.661 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.663 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.708 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.716 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.759 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.759 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.802 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.808 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.860 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.861 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:29:59 np0005601977 nova_compute[183130]: 2026-01-30 09:29:59.919 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.127 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.128 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5036MB free_disk=73.24380493164062GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.129 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.129 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.269 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.276 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Migration for instance 22bc0323-ee7d-4b6e-992e-a2410bf240e6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.299 183134 INFO nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Updating resource usage from migration 438d6bf5-4644-42e3-9c39-c5003fe539c9#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.299 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Starting to track incoming migration 438d6bf5-4644-42e3-9c39-c5003fe539c9 with flavor 43faf4bc-65eb-437f-b3dc-707ebe898840 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.359 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.359 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 93629e5c-ca92-47ac-8567-35d85b4e2a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.359 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 362db3ae-3984-411e-994b-55924dc0c06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.385 183134 WARNING nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 22bc0323-ee7d-4b6e-992e-a2410bf240e6 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.385 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.385 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.510 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.563 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.625 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.625 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.869 183134 INFO nova.compute.manager [None req-e80578fd-2267-4f58-b73b-fd914882413e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Get console output#033[00m
Jan 30 04:30:00 np0005601977 nova_compute[183130]: 2026-01-30 09:30:00.873 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.117 183134 INFO nova.compute.manager [None req-256a7690-2cd2-4320-a4a8-4b1e061308ca 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Unpausing#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.118 183134 DEBUG nova.objects.instance [None req-256a7690-2cd2-4320-a4a8-4b1e061308ca 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'flavor' on Instance uuid 362db3ae-3984-411e-994b-55924dc0c06f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.160 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765401.1602118, 362db3ae-3984-411e-994b-55924dc0c06f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.161 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:30:01 np0005601977 virtqemud[182587]: argument unsupported: QEMU guest agent is not configured
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.165 183134 DEBUG nova.virt.libvirt.guest [None req-256a7690-2cd2-4320-a4a8-4b1e061308ca 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.165 183134 DEBUG nova.compute.manager [None req-256a7690-2cd2-4320-a4a8-4b1e061308ca 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.240 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.243 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:30:01 np0005601977 nova_compute[183130]: 2026-01-30 09:30:01.285 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 30 04:30:02 np0005601977 nova_compute[183130]: 2026-01-30 09:30:02.361 183134 INFO nova.compute.manager [None req-a2af38ea-8612-4680-bab7-d838a81597eb 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Get console output#033[00m
Jan 30 04:30:02 np0005601977 nova_compute[183130]: 2026-01-30 09:30:02.366 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:30:03 np0005601977 nova_compute[183130]: 2026-01-30 09:30:03.733 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:04 np0005601977 nova_compute[183130]: 2026-01-30 09:30:04.625 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:04 np0005601977 nova_compute[183130]: 2026-01-30 09:30:04.626 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:04 np0005601977 podman[216811]: 2026-01-30 09:30:04.846014659 +0000 UTC m=+0.061857888 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:30:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:04Z|00181|binding|INFO|Releasing lport 66227ea9-e1c4-4b2a-8e46-c63d6ca3d55b from this chassis (sb_readonly=0)
Jan 30 04:30:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:04Z|00182|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:30:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:04Z|00183|binding|INFO|Releasing lport 34137aa8-3b0b-4b19-b520-be2930318935 from this chassis (sb_readonly=0)
Jan 30 04:30:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:04Z|00184|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.031 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.170 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.171 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.171 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.172 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.172 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.174 183134 INFO nova.compute.manager [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Terminating instance#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.176 183134 DEBUG nova.compute.manager [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:30:05 np0005601977 kernel: tap4433db17-a6 (unregistering): left promiscuous mode
Jan 30 04:30:05 np0005601977 NetworkManager[55565]: <info>  [1769765405.2024] device (tap4433db17-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:30:05 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:05Z|00185|binding|INFO|Releasing lport 4433db17-a607-4a44-9251-c5e602dc0576 from this chassis (sb_readonly=0)
Jan 30 04:30:05 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:05Z|00186|binding|INFO|Setting lport 4433db17-a607-4a44-9251-c5e602dc0576 down in Southbound
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.214 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:05Z|00187|binding|INFO|Removing iface tap4433db17-a6 ovn-installed in OVS
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.220 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.225 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.230 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:24:1c 10.100.0.14'], port_security=['fa:16:3e:1c:24:1c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '362db3ae-3984-411e-994b-55924dc0c06f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60eba810-de66-4c2e-8c3c-70333d77e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9d31663-60dd-457d-986c-66184f7449fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e93af628-2f8e-4aae-a24f-d949db135cfb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=4433db17-a607-4a44-9251-c5e602dc0576) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.231 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 4433db17-a607-4a44-9251-c5e602dc0576 in datapath 60eba810-de66-4c2e-8c3c-70333d77e79c unbound from our chassis#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.233 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60eba810-de66-4c2e-8c3c-70333d77e79c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.234 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f500db74-c972-48a9-97be-daaba17adb27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.235 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c namespace which is not needed anymore#033[00m
Jan 30 04:30:05 np0005601977 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 30 04:30:05 np0005601977 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000015.scope: Consumed 11.852s CPU time.
Jan 30 04:30:05 np0005601977 systemd-machined[154431]: Machine qemu-13-instance-00000015 terminated.
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.273 183134 DEBUG nova.compute.manager [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-changed-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.273 183134 DEBUG nova.compute.manager [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Refreshing instance network info cache due to event network-changed-4433db17-a607-4a44-9251-c5e602dc0576. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.274 183134 DEBUG oslo_concurrency.lockutils [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.274 183134 DEBUG oslo_concurrency.lockutils [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.274 183134 DEBUG nova.network.neutron [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Refreshing network info cache for port 4433db17-a607-4a44-9251-c5e602dc0576 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [NOTICE]   (216234) : haproxy version is 2.8.14-c23fe91
Jan 30 04:30:05 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [NOTICE]   (216234) : path to executable is /usr/sbin/haproxy
Jan 30 04:30:05 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [WARNING]  (216234) : Exiting Master process...
Jan 30 04:30:05 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [ALERT]    (216234) : Current worker (216236) exited with code 143 (Terminated)
Jan 30 04:30:05 np0005601977 neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c[216230]: [WARNING]  (216234) : All workers exited. Exiting... (0)
Jan 30 04:30:05 np0005601977 systemd[1]: libpod-0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4.scope: Deactivated successfully.
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:30:05 np0005601977 podman[216862]: 2026-01-30 09:30:05.350730539 +0000 UTC m=+0.045966064 container died 0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.378 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 30 04:30:05 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4-userdata-shm.mount: Deactivated successfully.
Jan 30 04:30:05 np0005601977 systemd[1]: var-lib-containers-storage-overlay-08b0058770c6cded8151a7cda7fcd7f5137af9ea74c2cd106c44d9e57dba828c-merged.mount: Deactivated successfully.
Jan 30 04:30:05 np0005601977 podman[216862]: 2026-01-30 09:30:05.398971217 +0000 UTC m=+0.094206732 container cleanup 0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.400 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 systemd[1]: libpod-conmon-0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4.scope: Deactivated successfully.
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.405 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.428 183134 INFO nova.virt.libvirt.driver [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Instance destroyed successfully.#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.428 183134 DEBUG nova.objects.instance [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 362db3ae-3984-411e-994b-55924dc0c06f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.445 183134 DEBUG nova.virt.libvirt.vif [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:29:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-149082075',display_name='tempest-TestNetworkAdvancedServerOps-server-149082075',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-149082075',id=21,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKizssFknmOKm5eKVAWB6797WgwhrtXfhU+/0cyecNJIu+hHN3gvLXGJMRDzhKYD8/8v9exvNpKsHuhoX+8PPA8mlsBy0hC0QpmrhJ0OUKXCR52DAu2aaKvfZix0Lc+IiQ==',key_name='tempest-TestNetworkAdvancedServerOps-1084377927',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-e0w57m40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:30:01Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=362db3ae-3984-411e-994b-55924dc0c06f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.446 183134 DEBUG nova.network.os_vif_util [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.446 183134 DEBUG nova.network.os_vif_util [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.447 183134 DEBUG os_vif [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.449 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.449 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4433db17-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.451 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.452 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 podman[216899]: 2026-01-30 09:30:05.45228736 +0000 UTC m=+0.037966095 container remove 0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.454 183134 INFO os_vif [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:24:1c,bridge_name='br-int',has_traffic_filtering=True,id=4433db17-a607-4a44-9251-c5e602dc0576,network=Network(60eba810-de66-4c2e-8c3c-70333d77e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4433db17-a6')#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.455 183134 INFO nova.virt.libvirt.driver [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Deleting instance files /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f_del#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.455 183134 INFO nova.virt.libvirt.driver [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Deletion of /var/lib/nova/instances/362db3ae-3984-411e-994b-55924dc0c06f_del complete#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.456 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bda88960-edb8-4380-be9a-709932d89275]: (4, ('Fri Jan 30 09:30:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c (0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4)\n0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4\nFri Jan 30 09:30:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c (0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4)\n0aed171289815b233b57ae13fd90edef65e8881a6091049764cffe6dd79dcdb4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.457 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a37cc907-cf0d-4746-8cb5-4a5312301070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.458 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60eba810-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:05 np0005601977 kernel: tap60eba810-d0: left promiscuous mode
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.460 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.463 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.465 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cbaa3065-c33c-4fe1-b71a-5b2befe6a879]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.479 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5fcdc0-6e35-4c10-81bb-ad10e3b81463]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.480 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[30f687ae-6795-4efc-aa91-6c5c3982ec91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.491 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d6feaf5d-a81e-473f-ad90-31a4ddbbdfa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 390745, 'reachable_time': 25606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216925, 'error': None, 'target': 'ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.492 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-60eba810-de66-4c2e-8c3c-70333d77e79c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:30:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:05.492 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[a140278d-8d67-4ef0-9d71-c53aed4403c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:05 np0005601977 systemd[1]: run-netns-ovnmeta\x2d60eba810\x2dde66\x2d4c2e\x2d8c3c\x2d70333d77e79c.mount: Deactivated successfully.
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.545 183134 INFO nova.compute.manager [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.546 183134 DEBUG oslo.service.loopingcall [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.546 183134 DEBUG nova.compute.manager [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:30:05 np0005601977 nova_compute[183130]: 2026-01-30 09:30:05.546 183134 DEBUG nova.network.neutron [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:30:06 np0005601977 nova_compute[183130]: 2026-01-30 09:30:06.413 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:30:06 np0005601977 nova_compute[183130]: 2026-01-30 09:30:06.414 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:30:06 np0005601977 nova_compute[183130]: 2026-01-30 09:30:06.414 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:30:06 np0005601977 nova_compute[183130]: 2026-01-30 09:30:06.415 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 93629e5c-ca92-47ac-8567-35d85b4e2a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.224 183134 DEBUG nova.network.neutron [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.270 183134 INFO nova.compute.manager [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Took 1.72 seconds to deallocate network for instance.#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.369 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.370 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.377 183134 DEBUG nova.compute.manager [req-eb57a220-1d86-47b8-90fd-67af2aa05d73 req-5841058d-3331-4cd5-a558-167dde665433 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-vif-deleted-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.446 183134 DEBUG nova.compute.manager [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-vif-unplugged-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.446 183134 DEBUG oslo_concurrency.lockutils [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.446 183134 DEBUG oslo_concurrency.lockutils [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.447 183134 DEBUG oslo_concurrency.lockutils [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.447 183134 DEBUG nova.compute.manager [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] No waiting events found dispatching network-vif-unplugged-4433db17-a607-4a44-9251-c5e602dc0576 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.447 183134 WARNING nova.compute.manager [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received unexpected event network-vif-unplugged-4433db17-a607-4a44-9251-c5e602dc0576 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.447 183134 DEBUG nova.compute.manager [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.447 183134 DEBUG oslo_concurrency.lockutils [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "362db3ae-3984-411e-994b-55924dc0c06f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.448 183134 DEBUG oslo_concurrency.lockutils [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.448 183134 DEBUG oslo_concurrency.lockutils [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.448 183134 DEBUG nova.compute.manager [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] No waiting events found dispatching network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.448 183134 WARNING nova.compute.manager [req-b35f5df4-0b63-4f21-86a0-cadf6c4ebe0b req-3589b333-a966-4d0e-969d-e8e35c64aa10 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Received unexpected event network-vif-plugged-4433db17-a607-4a44-9251-c5e602dc0576 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.520 183134 DEBUG nova.compute.provider_tree [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.541 183134 DEBUG nova.scheduler.client.report [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.562 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.590 183134 INFO nova.scheduler.client.report [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocations for instance 362db3ae-3984-411e-994b-55924dc0c06f#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.661 183134 DEBUG oslo_concurrency.lockutils [None req-5777713b-4a24-4151-b1a9-39527d940806 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "362db3ae-3984-411e-994b-55924dc0c06f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.490s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.921 183134 DEBUG nova.network.neutron [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updated VIF entry in instance network info cache for port 4433db17-a607-4a44-9251-c5e602dc0576. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.922 183134 DEBUG nova.network.neutron [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Updating instance_info_cache with network_info: [{"id": "4433db17-a607-4a44-9251-c5e602dc0576", "address": "fa:16:3e:1c:24:1c", "network": {"id": "60eba810-de66-4c2e-8c3c-70333d77e79c", "bridge": "br-int", "label": "tempest-network-smoke--656601634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4433db17-a6", "ovs_interfaceid": "4433db17-a607-4a44-9251-c5e602dc0576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:07 np0005601977 nova_compute[183130]: 2026-01-30 09:30:07.946 183134 DEBUG oslo_concurrency.lockutils [req-ca4407b7-bd82-40c1-bb30-c367046d68de req-80bfa0f3-d492-4be2-890c-3c61151b8530 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-362db3ae-3984-411e-994b-55924dc0c06f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.133 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.151 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.151 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.152 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.152 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.152 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.153 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.630 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765393.6296642, 65e07f9f-264b-4e0d-9aa7-f87ebaf84705 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.631 183134 INFO nova.compute.manager [-] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:30:08 np0005601977 nova_compute[183130]: 2026-01-30 09:30:08.740 183134 DEBUG nova.compute.manager [None req-d1e690c6-0910-4390-bedd-92b6e73fc214 - - - - - -] [instance: 65e07f9f-264b-4e0d-9aa7-f87ebaf84705] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:09 np0005601977 nova_compute[183130]: 2026-01-30 09:30:09.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:10 np0005601977 nova_compute[183130]: 2026-01-30 09:30:10.272 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:10 np0005601977 nova_compute[183130]: 2026-01-30 09:30:10.487 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:10 np0005601977 nova_compute[183130]: 2026-01-30 09:30:10.866 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:12 np0005601977 podman[216926]: 2026-01-30 09:30:12.84984021 +0000 UTC m=+0.059274474 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 30 04:30:12 np0005601977 podman[216927]: 2026-01-30 09:30:12.854164374 +0000 UTC m=+0.060440358 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:30:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:15Z|00188|binding|INFO|Releasing lport 66227ea9-e1c4-4b2a-8e46-c63d6ca3d55b from this chassis (sb_readonly=0)
Jan 30 04:30:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:15Z|00189|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:30:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:15Z|00190|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:30:15 np0005601977 nova_compute[183130]: 2026-01-30 09:30:15.154 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:15 np0005601977 nova_compute[183130]: 2026-01-30 09:30:15.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:15 np0005601977 nova_compute[183130]: 2026-01-30 09:30:15.488 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:19 np0005601977 nova_compute[183130]: 2026-01-30 09:30:19.899 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:20 np0005601977 nova_compute[183130]: 2026-01-30 09:30:20.278 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:20 np0005601977 nova_compute[183130]: 2026-01-30 09:30:20.426 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765405.4247267, 362db3ae-3984-411e-994b-55924dc0c06f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:30:20 np0005601977 nova_compute[183130]: 2026-01-30 09:30:20.426 183134 INFO nova.compute.manager [-] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] VM Stopped (Lifecycle Event)
Jan 30 04:30:20 np0005601977 nova_compute[183130]: 2026-01-30 09:30:20.466 183134 DEBUG nova.compute.manager [None req-ca8d20fb-fe62-45e6-a54d-e4999318ee3a - - - - - -] [instance: 362db3ae-3984-411e-994b-55924dc0c06f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:30:20 np0005601977 nova_compute[183130]: 2026-01-30 09:30:20.536 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:20 np0005601977 podman[216963]: 2026-01-30 09:30:20.84352279 +0000 UTC m=+0.056974408 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:30:20 np0005601977 podman[216962]: 2026-01-30 09:30:20.861037441 +0000 UTC m=+0.079293207 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:30:24 np0005601977 nova_compute[183130]: 2026-01-30 09:30:24.962 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:25 np0005601977 nova_compute[183130]: 2026-01-30 09:30:25.281 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:25 np0005601977 nova_compute[183130]: 2026-01-30 09:30:25.537 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:25 np0005601977 podman[217002]: 2026-01-30 09:30:25.876585787 +0000 UTC m=+0.080203822 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.078 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.080 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.102 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.199 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.199 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.205 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.205 183134 INFO nova.compute.claims [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Claim successful on node compute-0.ctlplane.example.com
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.377 183134 DEBUG nova.compute.provider_tree [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.391 183134 DEBUG nova.scheduler.client.report [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.410 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.410 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.459 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.460 183134 DEBUG nova.network.neutron [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.481 183134 INFO nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.499 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.599 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.601 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.601 183134 INFO nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Creating image(s)
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.602 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.602 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.604 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.617 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.659 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.660 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.660 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.673 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.706 183134 DEBUG nova.policy [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.726 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.726 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.757 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.759 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.759 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.816 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.817 183134 DEBUG nova.virt.disk.api [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.817 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.871 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.872 183134 DEBUG nova.virt.disk.api [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.872 183134 DEBUG nova.objects.instance [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 510e72ed-ac04-4a15-babc-98d067a699fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.889 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.890 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Ensure instance console log exists: /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.890 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.890 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:27 np0005601977 nova_compute[183130]: 2026-01-30 09:30:27.891 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:30:29 np0005601977 nova_compute[183130]: 2026-01-30 09:30:29.142 183134 DEBUG nova.network.neutron [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Successfully created port: 24438658-5388-4fa2-a1bb-4a7cc225f3ca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 30 04:30:29 np0005601977 nova_compute[183130]: 2026-01-30 09:30:29.400 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:30:29 np0005601977 nova_compute[183130]: 2026-01-30 09:30:29.935 183134 DEBUG nova.network.neutron [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Successfully updated port: 24438658-5388-4fa2-a1bb-4a7cc225f3ca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 30 04:30:29 np0005601977 nova_compute[183130]: 2026-01-30 09:30:29.952 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:30:29 np0005601977 nova_compute[183130]: 2026-01-30 09:30:29.953 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:30:29 np0005601977 nova_compute[183130]: 2026-01-30 09:30:29.953 183134 DEBUG nova.network.neutron [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:30:30 np0005601977 nova_compute[183130]: 2026-01-30 09:30:30.035 183134 DEBUG nova.compute.manager [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-changed-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:30:30 np0005601977 nova_compute[183130]: 2026-01-30 09:30:30.036 183134 DEBUG nova.compute.manager [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Refreshing instance network info cache due to event network-changed-24438658-5388-4fa2-a1bb-4a7cc225f3ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:30:30 np0005601977 nova_compute[183130]: 2026-01-30 09:30:30.036 183134 DEBUG oslo_concurrency.lockutils [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:30:30 np0005601977 nova_compute[183130]: 2026-01-30 09:30:30.105 183134 DEBUG nova.network.neutron [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:30:30 np0005601977 nova_compute[183130]: 2026-01-30 09:30:30.283 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:30 np0005601977 nova_compute[183130]: 2026-01-30 09:30:30.538 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.146 183134 DEBUG nova.network.neutron [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updating instance_info_cache with network_info: [{"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.164 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.165 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Instance network_info: |[{"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.165 183134 DEBUG oslo_concurrency.lockutils [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.165 183134 DEBUG nova.network.neutron [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Refreshing network info cache for port 24438658-5388-4fa2-a1bb-4a7cc225f3ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.168 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Start _get_guest_xml network_info=[{"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.173 183134 WARNING nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.179 183134 DEBUG nova.virt.libvirt.host [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.180 183134 DEBUG nova.virt.libvirt.host [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.187 183134 DEBUG nova.virt.libvirt.host [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.188 183134 DEBUG nova.virt.libvirt.host [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.189 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.190 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.190 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.190 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.190 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.190 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.191 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.191 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.191 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.191 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.191 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.192 183134 DEBUG nova.virt.hardware [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.195 183134 DEBUG nova.virt.libvirt.vif [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:30:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1756779848',display_name='tempest-TestNetworkBasicOps-server-1756779848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1756779848',id=23,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF7nBvfqfwGy2liDEg9UFi8jz2EW5wv6a3dNMz1Fig7uIT7wmLwG0gyZV3tPLN/mXtBT4KR4PDkroFmb3OmmVcTwxuN/C1qocWLf9C43upOGgL+ReeZ+dQ/I6+UAhTowqw==',key_name='tempest-TestNetworkBasicOps-828198191',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-b0z09rz4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:30:27Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=510e72ed-ac04-4a15-babc-98d067a699fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.196 183134 DEBUG nova.network.os_vif_util [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.197 183134 DEBUG nova.network.os_vif_util [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.198 183134 DEBUG nova.objects.instance [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid 510e72ed-ac04-4a15-babc-98d067a699fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.215 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <uuid>510e72ed-ac04-4a15-babc-98d067a699fa</uuid>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <name>instance-00000017</name>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-1756779848</nova:name>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:30:31</nova:creationTime>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        <nova:port uuid="24438658-5388-4fa2-a1bb-4a7cc225f3ca">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <entry name="serial">510e72ed-ac04-4a15-babc-98d067a699fa</entry>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <entry name="uuid">510e72ed-ac04-4a15-babc-98d067a699fa</entry>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.config"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:09:b0:a8"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <target dev="tap24438658-53"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/console.log" append="off"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:30:31 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:30:31 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:30:31 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:30:31 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.215 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Preparing to wait for external event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.215 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.216 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.216 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.217 183134 DEBUG nova.virt.libvirt.vif [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:30:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1756779848',display_name='tempest-TestNetworkBasicOps-server-1756779848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1756779848',id=23,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF7nBvfqfwGy2liDEg9UFi8jz2EW5wv6a3dNMz1Fig7uIT7wmLwG0gyZV3tPLN/mXtBT4KR4PDkroFmb3OmmVcTwxuN/C1qocWLf9C43upOGgL+ReeZ+dQ/I6+UAhTowqw==',key_name='tempest-TestNetworkBasicOps-828198191',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-b0z09rz4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:30:27Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=510e72ed-ac04-4a15-babc-98d067a699fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.217 183134 DEBUG nova.network.os_vif_util [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.218 183134 DEBUG nova.network.os_vif_util [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.218 183134 DEBUG os_vif [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.218 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.219 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.219 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.223 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.224 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24438658-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.224 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap24438658-53, col_values=(('external_ids', {'iface-id': '24438658-5388-4fa2-a1bb-4a7cc225f3ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:b0:a8', 'vm-uuid': '510e72ed-ac04-4a15-babc-98d067a699fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.257 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 NetworkManager[55565]: <info>  [1769765431.2580] manager: (tap24438658-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.261 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.265 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.266 183134 INFO os_vif [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53')#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.331 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.332 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.332 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:09:b0:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.332 183134 INFO nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Using config drive#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.697 183134 INFO nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Creating config drive at /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.config#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.700 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppr3m171x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.819 183134 DEBUG oslo_concurrency.processutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppr3m171x" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:30:31 np0005601977 kernel: tap24438658-53: entered promiscuous mode
Jan 30 04:30:31 np0005601977 NetworkManager[55565]: <info>  [1769765431.8719] manager: (tap24438658-53): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Jan 30 04:30:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:31Z|00191|binding|INFO|Claiming lport 24438658-5388-4fa2-a1bb-4a7cc225f3ca for this chassis.
Jan 30 04:30:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:31Z|00192|binding|INFO|24438658-5388-4fa2-a1bb-4a7cc225f3ca: Claiming fa:16:3e:09:b0:a8 10.100.0.10
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.872 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:31Z|00193|binding|INFO|Setting lport 24438658-5388-4fa2-a1bb-4a7cc225f3ca ovn-installed in OVS
Jan 30 04:30:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:31Z|00194|binding|INFO|Setting lport 24438658-5388-4fa2-a1bb-4a7cc225f3ca up in Southbound
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.886 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:b0:a8 10.100.0.10'], port_security=['fa:16:3e:09:b0:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '510e72ed-ac04-4a15-babc-98d067a699fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bf07673-f284-4722-a20b-66fb3bba1a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c0bd8cd5-4cca-4fa1-8f73-177b23e189a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7912079a-8212-4607-b4d0-2d35aabace59, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=24438658-5388-4fa2-a1bb-4a7cc225f3ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.888 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 24438658-5388-4fa2-a1bb-4a7cc225f3ca in datapath 4bf07673-f284-4722-a20b-66fb3bba1a03 bound to our chassis#033[00m
Jan 30 04:30:31 np0005601977 nova_compute[183130]: 2026-01-30 09:30:31.889 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.890 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bf07673-f284-4722-a20b-66fb3bba1a03#033[00m
Jan 30 04:30:31 np0005601977 systemd-machined[154431]: New machine qemu-15-instance-00000017.
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.901 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2bcc7de3-5f1f-4f82-b3fe-dd80f28968f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.903 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4bf07673-f1 in ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.906 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4bf07673-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.906 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c41227-372e-4da9-9118-a7cf7f1d2b40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.908 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[128c721b-f7d6-4c5e-aa7a-02232d05c50d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:31 np0005601977 systemd[1]: Started Virtual Machine qemu-15-instance-00000017.
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.918 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[048bb800-1e89-427d-8f15-33ebec401864]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:31 np0005601977 systemd-udevd[217065]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.932 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[38456a30-e9c2-437b-9076-0f55c0f07d9f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:31 np0005601977 NetworkManager[55565]: <info>  [1769765431.9409] device (tap24438658-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:30:31 np0005601977 NetworkManager[55565]: <info>  [1769765431.9419] device (tap24438658-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.965 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9f827e-3e6c-4c1c-aa96-97b43b06d3c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:31 np0005601977 NetworkManager[55565]: <info>  [1769765431.9731] manager: (tap4bf07673-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Jan 30 04:30:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:31.972 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b7453c4b-51b4-48f9-8aaf-b6a87c9e3349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.005 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[03c7c25e-9513-4f3f-8d31-ccc29716c7c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.009 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[40d8f607-60ed-4d76-b9f5-e6b2bffac105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 NetworkManager[55565]: <info>  [1769765432.0297] device (tap4bf07673-f0): carrier: link connected
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.034 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4ca045-f917-4847-8c3c-011294ccd578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.048 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ab788dbf-7508-495f-bacd-308c92174b48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bf07673-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:9d:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396336, 'reachable_time': 36063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217096, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.061 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4e960f-f30a-44f5-bf1f-7d57a15c1ba5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:9d8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396336, 'tstamp': 396336}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217097, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.078 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[296b1b71-db7d-40ae-b34e-e9fec1719b9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bf07673-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:9d:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396336, 'reachable_time': 36063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217098, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.112 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0be9d7c5-a626-4825-b3e9-eccc748147fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.162 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[31df9436-21ea-4402-86f8-1d70e19285ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.163 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf07673-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.163 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.163 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bf07673-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.165 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:32 np0005601977 kernel: tap4bf07673-f0: entered promiscuous mode
Jan 30 04:30:32 np0005601977 NetworkManager[55565]: <info>  [1769765432.1658] manager: (tap4bf07673-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.168 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.169 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bf07673-f0, col_values=(('external_ids', {'iface-id': '7e936c7b-5f33-4a17-8358-59044e61e6d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.170 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:32Z|00195|binding|INFO|Releasing lport 7e936c7b-5f33-4a17-8358-59044e61e6d1 from this chassis (sb_readonly=0)
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.171 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.172 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bf07673-f284-4722-a20b-66fb3bba1a03.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bf07673-f284-4722-a20b-66fb3bba1a03.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.173 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[066834ae-f5fd-401b-a3fd-25d1a132b12e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.173 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-4bf07673-f284-4722-a20b-66fb3bba1a03
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/4bf07673-f284-4722-a20b-66fb3bba1a03.pid.haproxy
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 4bf07673-f284-4722-a20b-66fb3bba1a03
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:30:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:32.174 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'env', 'PROCESS_TAG=haproxy-4bf07673-f284-4722-a20b-66fb3bba1a03', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4bf07673-f284-4722-a20b-66fb3bba1a03.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.175 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.414 183134 DEBUG nova.network.neutron [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updated VIF entry in instance network info cache for port 24438658-5388-4fa2-a1bb-4a7cc225f3ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.415 183134 DEBUG nova.network.neutron [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updating instance_info_cache with network_info: [{"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.436 183134 DEBUG oslo_concurrency.lockutils [req-503b34cb-feba-4dca-9e97-d723a047e2ea req-4181796b-a3ae-448b-9869-cd4dd250cab3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:32 np0005601977 podman[217130]: 2026-01-30 09:30:32.509019006 +0000 UTC m=+0.061914250 container create 1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:30:32 np0005601977 systemd[1]: Started libpod-conmon-1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86.scope.
Jan 30 04:30:32 np0005601977 podman[217130]: 2026-01-30 09:30:32.473489921 +0000 UTC m=+0.026385185 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:30:32 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:30:32 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/337789b96227970c4256f75367d45c3a773007aaf88b458349f5a12b261a253f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:30:32 np0005601977 podman[217130]: 2026-01-30 09:30:32.592042538 +0000 UTC m=+0.144937742 container init 1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:30:32 np0005601977 podman[217130]: 2026-01-30 09:30:32.596838085 +0000 UTC m=+0.149733289 container start 1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:30:32 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [NOTICE]   (217156) : New worker (217158) forked
Jan 30 04:30:32 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [NOTICE]   (217156) : Loading success.
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.626 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765432.6253092, 510e72ed-ac04-4a15-babc-98d067a699fa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.626 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] VM Started (Lifecycle Event)#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.646 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.650 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765432.6256838, 510e72ed-ac04-4a15-babc-98d067a699fa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.650 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.669 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.672 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:30:32 np0005601977 nova_compute[183130]: 2026-01-30 09:30:32.692 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:30:35 np0005601977 nova_compute[183130]: 2026-01-30 09:30:35.332 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:35 np0005601977 podman[217167]: 2026-01-30 09:30:35.439776019 +0000 UTC m=+0.072964176 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.258 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.589 183134 DEBUG nova.compute.manager [req-7d9bebdb-cb8e-47ba-a4cf-2de80133d91f req-96b38315-fe80-421f-80ad-251a540a1ef9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.589 183134 DEBUG oslo_concurrency.lockutils [req-7d9bebdb-cb8e-47ba-a4cf-2de80133d91f req-96b38315-fe80-421f-80ad-251a540a1ef9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.589 183134 DEBUG oslo_concurrency.lockutils [req-7d9bebdb-cb8e-47ba-a4cf-2de80133d91f req-96b38315-fe80-421f-80ad-251a540a1ef9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.589 183134 DEBUG oslo_concurrency.lockutils [req-7d9bebdb-cb8e-47ba-a4cf-2de80133d91f req-96b38315-fe80-421f-80ad-251a540a1ef9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.590 183134 DEBUG nova.compute.manager [req-7d9bebdb-cb8e-47ba-a4cf-2de80133d91f req-96b38315-fe80-421f-80ad-251a540a1ef9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Processing event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.590 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.595 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765436.595238, 510e72ed-ac04-4a15-babc-98d067a699fa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.596 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.600 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.604 183134 INFO nova.virt.libvirt.driver [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Instance spawned successfully.#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.604 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.628 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.634 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.639 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.640 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.640 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.641 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.641 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.642 183134 DEBUG nova.virt.libvirt.driver [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.668 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.710 183134 INFO nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Took 9.11 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.711 183134 DEBUG nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.820 183134 INFO nova.compute.manager [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Took 9.65 seconds to build instance.#033[00m
Jan 30 04:30:36 np0005601977 nova_compute[183130]: 2026-01-30 09:30:36.845 183134 DEBUG oslo_concurrency.lockutils [None req-3ffb2565-d690-4ca9-9321-4b721b46e943 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:38 np0005601977 nova_compute[183130]: 2026-01-30 09:30:38.684 183134 DEBUG nova.compute.manager [req-882e0ceb-b6bd-405f-8bfc-437ad0486e9b req-abe20603-7cc8-49c9-bd96-0796c6457eb0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:38 np0005601977 nova_compute[183130]: 2026-01-30 09:30:38.684 183134 DEBUG oslo_concurrency.lockutils [req-882e0ceb-b6bd-405f-8bfc-437ad0486e9b req-abe20603-7cc8-49c9-bd96-0796c6457eb0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:38 np0005601977 nova_compute[183130]: 2026-01-30 09:30:38.685 183134 DEBUG oslo_concurrency.lockutils [req-882e0ceb-b6bd-405f-8bfc-437ad0486e9b req-abe20603-7cc8-49c9-bd96-0796c6457eb0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:38 np0005601977 nova_compute[183130]: 2026-01-30 09:30:38.685 183134 DEBUG oslo_concurrency.lockutils [req-882e0ceb-b6bd-405f-8bfc-437ad0486e9b req-abe20603-7cc8-49c9-bd96-0796c6457eb0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:38 np0005601977 nova_compute[183130]: 2026-01-30 09:30:38.685 183134 DEBUG nova.compute.manager [req-882e0ceb-b6bd-405f-8bfc-437ad0486e9b req-abe20603-7cc8-49c9-bd96-0796c6457eb0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] No waiting events found dispatching network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:30:38 np0005601977 nova_compute[183130]: 2026-01-30 09:30:38.686 183134 WARNING nova.compute.manager [req-882e0ceb-b6bd-405f-8bfc-437ad0486e9b req-abe20603-7cc8-49c9-bd96-0796c6457eb0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received unexpected event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca for instance with vm_state active and task_state None.#033[00m
Jan 30 04:30:40 np0005601977 nova_compute[183130]: 2026-01-30 09:30:40.090 183134 INFO nova.compute.manager [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Post operation of migration started#033[00m
Jan 30 04:30:40 np0005601977 nova_compute[183130]: 2026-01-30 09:30:40.334 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:40 np0005601977 nova_compute[183130]: 2026-01-30 09:30:40.650 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "refresh_cache-22bc0323-ee7d-4b6e-992e-a2410bf240e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:30:40 np0005601977 nova_compute[183130]: 2026-01-30 09:30:40.651 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquired lock "refresh_cache-22bc0323-ee7d-4b6e-992e-a2410bf240e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:30:40 np0005601977 nova_compute[183130]: 2026-01-30 09:30:40.651 183134 DEBUG nova.network.neutron [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:30:41 np0005601977 nova_compute[183130]: 2026-01-30 09:30:41.260 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:42.891 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:30:42 np0005601977 nova_compute[183130]: 2026-01-30 09:30:42.891 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:42.892 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:30:43 np0005601977 nova_compute[183130]: 2026-01-30 09:30:43.613 183134 DEBUG nova.network.neutron [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Updating instance_info_cache with network_info: [{"id": "e680749e-01e2-462e-8755-8b4f01e1272e", "address": "fa:16:3e:92:37:22", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape680749e-01", "ovs_interfaceid": "e680749e-01e2-462e-8755-8b4f01e1272e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:43 np0005601977 nova_compute[183130]: 2026-01-30 09:30:43.637 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Releasing lock "refresh_cache-22bc0323-ee7d-4b6e-992e-a2410bf240e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:43 np0005601977 nova_compute[183130]: 2026-01-30 09:30:43.655 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:43 np0005601977 nova_compute[183130]: 2026-01-30 09:30:43.656 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:43 np0005601977 nova_compute[183130]: 2026-01-30 09:30:43.656 183134 DEBUG oslo_concurrency.lockutils [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:43 np0005601977 nova_compute[183130]: 2026-01-30 09:30:43.660 183134 INFO nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 30 04:30:43 np0005601977 virtqemud[182587]: Domain id=14 name='instance-00000010' uuid=22bc0323-ee7d-4b6e-992e-a2410bf240e6 is tainted: custom-monitor
Jan 30 04:30:43 np0005601977 podman[217192]: 2026-01-30 09:30:43.849976029 +0000 UTC m=+0.063414563 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:30:43 np0005601977 podman[217191]: 2026-01-30 09:30:43.855809156 +0000 UTC m=+0.069098425 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.7, config_id=openstack_network_exporter)
Jan 30 04:30:44 np0005601977 nova_compute[183130]: 2026-01-30 09:30:44.438 183134 DEBUG nova.compute.manager [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-changed-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:44 np0005601977 nova_compute[183130]: 2026-01-30 09:30:44.438 183134 DEBUG nova.compute.manager [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Refreshing instance network info cache due to event network-changed-24438658-5388-4fa2-a1bb-4a7cc225f3ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:30:44 np0005601977 nova_compute[183130]: 2026-01-30 09:30:44.439 183134 DEBUG oslo_concurrency.lockutils [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:30:44 np0005601977 nova_compute[183130]: 2026-01-30 09:30:44.439 183134 DEBUG oslo_concurrency.lockutils [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:30:44 np0005601977 nova_compute[183130]: 2026-01-30 09:30:44.440 183134 DEBUG nova.network.neutron [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Refreshing network info cache for port 24438658-5388-4fa2-a1bb-4a7cc225f3ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:30:44 np0005601977 nova_compute[183130]: 2026-01-30 09:30:44.668 183134 INFO nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 30 04:30:45 np0005601977 nova_compute[183130]: 2026-01-30 09:30:45.371 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:45 np0005601977 nova_compute[183130]: 2026-01-30 09:30:45.675 183134 INFO nova.virt.libvirt.driver [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 30 04:30:45 np0005601977 nova_compute[183130]: 2026-01-30 09:30:45.683 183134 DEBUG nova.compute.manager [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:30:45 np0005601977 nova_compute[183130]: 2026-01-30 09:30:45.715 183134 DEBUG nova.objects.instance [None req-5d8a028f-7f6a-48a2-8c1f-0746337651e2 d30bc373bf83429795aef03c2b5caabc 75da33fbff7a4f64b66bb8d4119e3f35 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 30 04:30:46 np0005601977 nova_compute[183130]: 2026-01-30 09:30:46.262 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:46 np0005601977 nova_compute[183130]: 2026-01-30 09:30:46.322 183134 DEBUG nova.network.neutron [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updated VIF entry in instance network info cache for port 24438658-5388-4fa2-a1bb-4a7cc225f3ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:30:46 np0005601977 nova_compute[183130]: 2026-01-30 09:30:46.323 183134 DEBUG nova.network.neutron [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updating instance_info_cache with network_info: [{"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:46 np0005601977 nova_compute[183130]: 2026-01-30 09:30:46.342 183134 DEBUG oslo_concurrency.lockutils [req-3e9de79c-937e-48ac-b120-a138d92c4e61 req-eca96ded-7de0-4015-ab92-4aca24135145 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:48Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:b0:a8 10.100.0.10
Jan 30 04:30:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:48Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:b0:a8 10.100.0.10
Jan 30 04:30:48 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:48.895 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.029 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.030 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.030 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.030 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.030 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.032 183134 INFO nova.compute.manager [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Terminating instance#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.033 183134 DEBUG nova.compute.manager [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:30:49 np0005601977 kernel: tape680749e-01 (unregistering): left promiscuous mode
Jan 30 04:30:49 np0005601977 NetworkManager[55565]: <info>  [1769765449.0560] device (tape680749e-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00196|binding|INFO|Releasing lport e680749e-01e2-462e-8755-8b4f01e1272e from this chassis (sb_readonly=0)
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00197|binding|INFO|Setting lport e680749e-01e2-462e-8755-8b4f01e1272e down in Southbound
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00198|binding|INFO|Releasing lport 40336582-d4ab-46e5-9089-cf09f796f51f from this chassis (sb_readonly=0)
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00199|binding|INFO|Setting lport 40336582-d4ab-46e5-9089-cf09f796f51f down in Southbound
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.067 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00200|binding|INFO|Removing iface tape680749e-01 ovn-installed in OVS
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.072 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00201|binding|INFO|Releasing lport 66227ea9-e1c4-4b2a-8e46-c63d6ca3d55b from this chassis (sb_readonly=0)
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00202|binding|INFO|Releasing lport 7e936c7b-5f33-4a17-8358-59044e61e6d1 from this chassis (sb_readonly=0)
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00203|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:30:49 np0005601977 ovn_controller[95460]: 2026-01-30T09:30:49Z|00204|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.078 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:37:22 10.100.0.5'], port_security=['fa:16:3e:92:37:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-547456304', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '22bc0323-ee7d-4b6e-992e-a2410bf240e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-547456304', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76c186c3-e40e-4db5-b50c-3686091722f9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=e680749e-01e2-462e-8755-8b4f01e1272e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.080 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:cf:3e 19.80.0.151'], port_security=['fa:16:3e:b4:cf:3e 19.80.0.151'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['e680749e-01e2-462e-8755-8b4f01e1272e'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-367508310', 'neutron:cidrs': '19.80.0.151/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01f97378-9667-4aa0-9b75-db68873adbbb', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-367508310', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=ca9da3af-412a-43bc-885e-95d28caf9a34, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=40336582-d4ab-46e5-9089-cf09f796f51f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.081 104706 INFO neutron.agent.ovn.metadata.agent [-] Port e680749e-01e2-462e-8755-8b4f01e1272e in datapath 8e0e3ea2-5897-4c05-8f15-ccf8330993c7 unbound from our chassis#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.083 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e3ea2-5897-4c05-8f15-ccf8330993c7#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.095 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[66bde821-1382-4858-9239-7ead08caf746]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 30 04:30:49 np0005601977 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Consumed 3.713s CPU time.
Jan 30 04:30:49 np0005601977 systemd-machined[154431]: Machine qemu-14-instance-00000010 terminated.
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.110 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.115 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.121 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[339438b4-95a6-4ee2-bb71-2e60e1fb8f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.125 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[26fd71b5-0b7a-4f95-a099-af4278994d17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.146 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[8be10d70-1e99-48e9-984f-69b188107aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.160 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9008bcf6-513e-4c9b-907f-5cdd1c41da0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e3ea2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:a6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 41, 'tx_packets': 11, 'rx_bytes': 2002, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 41, 'tx_packets': 11, 'rx_bytes': 2002, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366983, 'reachable_time': 24925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217257, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.176 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[86647aa7-2ba3-4262-a57f-038a4664fbb6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366992, 'tstamp': 366992}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217258, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e0e3ea2-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366995, 'tstamp': 366995}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217258, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.178 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e3ea2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.180 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.184 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.185 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e3ea2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.185 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.186 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e3ea2-50, col_values=(('external_ids', {'iface-id': '15b4d9a6-bad1-4bf8-a262-02e27eb8ea93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.187 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.188 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 40336582-d4ab-46e5-9089-cf09f796f51f in datapath 01f97378-9667-4aa0-9b75-db68873adbbb unbound from our chassis#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.192 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01f97378-9667-4aa0-9b75-db68873adbbb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.193 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3569656b-99ea-419d-a225-e8f3d462a44e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.194 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb namespace which is not needed anymore#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.289 183134 INFO nova.virt.libvirt.driver [-] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Instance destroyed successfully.#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.290 183134 DEBUG nova.objects.instance [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lazy-loading 'resources' on Instance uuid 22bc0323-ee7d-4b6e-992e-a2410bf240e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.305 183134 DEBUG nova.virt.libvirt.vif [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:27:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-481398456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-481398456',id=16,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-5f3v94ca',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:30:45Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=22bc0323-ee7d-4b6e-992e-a2410bf240e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e680749e-01e2-462e-8755-8b4f01e1272e", "address": "fa:16:3e:92:37:22", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape680749e-01", "ovs_interfaceid": "e680749e-01e2-462e-8755-8b4f01e1272e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.307 183134 DEBUG nova.network.os_vif_util [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converting VIF {"id": "e680749e-01e2-462e-8755-8b4f01e1272e", "address": "fa:16:3e:92:37:22", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape680749e-01", "ovs_interfaceid": "e680749e-01e2-462e-8755-8b4f01e1272e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.308 183134 DEBUG nova.network.os_vif_util [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:37:22,bridge_name='br-int',has_traffic_filtering=True,id=e680749e-01e2-462e-8755-8b4f01e1272e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape680749e-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.309 183134 DEBUG os_vif [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:37:22,bridge_name='br-int',has_traffic_filtering=True,id=e680749e-01e2-462e-8755-8b4f01e1272e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape680749e-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.311 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.311 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape680749e-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.313 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [NOTICE]   (216458) : haproxy version is 2.8.14-c23fe91
Jan 30 04:30:49 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [NOTICE]   (216458) : path to executable is /usr/sbin/haproxy
Jan 30 04:30:49 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [WARNING]  (216458) : Exiting Master process...
Jan 30 04:30:49 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [WARNING]  (216458) : Exiting Master process...
Jan 30 04:30:49 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [ALERT]    (216458) : Current worker (216460) exited with code 143 (Terminated)
Jan 30 04:30:49 np0005601977 neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb[216454]: [WARNING]  (216458) : All workers exited. Exiting... (0)
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.316 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:30:49 np0005601977 systemd[1]: libpod-adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13.scope: Deactivated successfully.
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.319 183134 INFO os_vif [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:37:22,bridge_name='br-int',has_traffic_filtering=True,id=e680749e-01e2-462e-8755-8b4f01e1272e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape680749e-01')#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.320 183134 INFO nova.virt.libvirt.driver [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Deleting instance files /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6_del#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.320 183134 INFO nova.virt.libvirt.driver [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Deletion of /var/lib/nova/instances/22bc0323-ee7d-4b6e-992e-a2410bf240e6_del complete#033[00m
Jan 30 04:30:49 np0005601977 podman[217287]: 2026-01-30 09:30:49.326484676 +0000 UTC m=+0.051329778 container died adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:30:49 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13-userdata-shm.mount: Deactivated successfully.
Jan 30 04:30:49 np0005601977 systemd[1]: var-lib-containers-storage-overlay-da8690ca0655ddc1bfef94f414d030808ee7e2e809b48b427b8c6b1b61990be0-merged.mount: Deactivated successfully.
Jan 30 04:30:49 np0005601977 podman[217287]: 2026-01-30 09:30:49.358896551 +0000 UTC m=+0.083741653 container cleanup adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.369 183134 INFO nova.compute.manager [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.370 183134 DEBUG oslo.service.loopingcall [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.371 183134 DEBUG nova.compute.manager [-] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.371 183134 DEBUG nova.network.neutron [-] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:30:49 np0005601977 systemd[1]: libpod-conmon-adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13.scope: Deactivated successfully.
Jan 30 04:30:49 np0005601977 podman[217324]: 2026-01-30 09:30:49.403469955 +0000 UTC m=+0.030811692 container remove adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.407 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[874fde6d-43b5-4914-bae3-d3932977a617]: (4, ('Fri Jan 30 09:30:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb (adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13)\nadaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13\nFri Jan 30 09:30:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb (adaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13)\nadaeba3e1fb991dbb8187fe8169419495ede1175413845e634ffb44fbc2e2b13\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.408 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c0cd328a-58b5-45fe-81c7-c1591130c33c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.409 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01f97378-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.410 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 kernel: tap01f97378-90: left promiscuous mode
Jan 30 04:30:49 np0005601977 nova_compute[183130]: 2026-01-30 09:30:49.414 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.417 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bce0eb85-309a-4a78-a1cc-478fc33e9e65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.444 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[97cd9c29-f2f0-44d6-a6e5-cbc7dd7b399a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.445 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[18240ced-1026-460e-8842-f67a9dfba1f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.454 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[79595961-86e7-4f58-991b-59b8eae64e49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391745, 'reachable_time': 25406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217339, 'error': None, 'target': 'ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.456 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01f97378-9667-4aa0-9b75-db68873adbbb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:30:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:49.456 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[82c6aedd-419a-4709-9719-05f2acdc5058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:30:49 np0005601977 systemd[1]: run-netns-ovnmeta\x2d01f97378\x2d9667\x2d4aa0\x2d9b75\x2ddb68873adbbb.mount: Deactivated successfully.
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.375 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.833 183134 DEBUG nova.compute.manager [req-b9856605-a376-4f1d-bcf3-7be0162aa63e req-6ad7e7ce-7079-4878-9038-c9863f14472a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Received event network-vif-unplugged-e680749e-01e2-462e-8755-8b4f01e1272e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.834 183134 DEBUG oslo_concurrency.lockutils [req-b9856605-a376-4f1d-bcf3-7be0162aa63e req-6ad7e7ce-7079-4878-9038-c9863f14472a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.834 183134 DEBUG oslo_concurrency.lockutils [req-b9856605-a376-4f1d-bcf3-7be0162aa63e req-6ad7e7ce-7079-4878-9038-c9863f14472a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.835 183134 DEBUG oslo_concurrency.lockutils [req-b9856605-a376-4f1d-bcf3-7be0162aa63e req-6ad7e7ce-7079-4878-9038-c9863f14472a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.835 183134 DEBUG nova.compute.manager [req-b9856605-a376-4f1d-bcf3-7be0162aa63e req-6ad7e7ce-7079-4878-9038-c9863f14472a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] No waiting events found dispatching network-vif-unplugged-e680749e-01e2-462e-8755-8b4f01e1272e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:30:50 np0005601977 nova_compute[183130]: 2026-01-30 09:30:50.836 183134 DEBUG nova.compute.manager [req-b9856605-a376-4f1d-bcf3-7be0162aa63e req-6ad7e7ce-7079-4878-9038-c9863f14472a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Received event network-vif-unplugged-e680749e-01e2-462e-8755-8b4f01e1272e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:30:51 np0005601977 podman[217340]: 2026-01-30 09:30:51.83283475 +0000 UTC m=+0.045046098 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 30 04:30:51 np0005601977 podman[217341]: 2026-01-30 09:30:51.850708441 +0000 UTC m=+0.053150930 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.551 183134 DEBUG nova.network.neutron [-] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.567 183134 INFO nova.compute.manager [-] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Took 3.20 seconds to deallocate network for instance.#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.608 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.608 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.612 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.649 183134 INFO nova.scheduler.client.report [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Deleted allocations for instance 22bc0323-ee7d-4b6e-992e-a2410bf240e6#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.728 183134 DEBUG oslo_concurrency.lockutils [None req-32220571-cab0-4aec-b277-ab3dbdb303d5 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.917 183134 DEBUG nova.compute.manager [req-b30f4b6f-2187-4f34-aed2-abece49fc8f3 req-c3bca5c3-9dc5-4ffe-9560-d78dc0b8c559 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Received event network-vif-plugged-e680749e-01e2-462e-8755-8b4f01e1272e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.917 183134 DEBUG oslo_concurrency.lockutils [req-b30f4b6f-2187-4f34-aed2-abece49fc8f3 req-c3bca5c3-9dc5-4ffe-9560-d78dc0b8c559 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.918 183134 DEBUG oslo_concurrency.lockutils [req-b30f4b6f-2187-4f34-aed2-abece49fc8f3 req-c3bca5c3-9dc5-4ffe-9560-d78dc0b8c559 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.918 183134 DEBUG oslo_concurrency.lockutils [req-b30f4b6f-2187-4f34-aed2-abece49fc8f3 req-c3bca5c3-9dc5-4ffe-9560-d78dc0b8c559 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22bc0323-ee7d-4b6e-992e-a2410bf240e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.918 183134 DEBUG nova.compute.manager [req-b30f4b6f-2187-4f34-aed2-abece49fc8f3 req-c3bca5c3-9dc5-4ffe-9560-d78dc0b8c559 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] No waiting events found dispatching network-vif-plugged-e680749e-01e2-462e-8755-8b4f01e1272e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:30:52 np0005601977 nova_compute[183130]: 2026-01-30 09:30:52.918 183134 WARNING nova.compute.manager [req-b30f4b6f-2187-4f34-aed2-abece49fc8f3 req-c3bca5c3-9dc5-4ffe-9560-d78dc0b8c559 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Received unexpected event network-vif-plugged-e680749e-01e2-462e-8755-8b4f01e1272e for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:30:54 np0005601977 nova_compute[183130]: 2026-01-30 09:30:54.314 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:54 np0005601977 nova_compute[183130]: 2026-01-30 09:30:54.699 183134 INFO nova.compute.manager [None req-26d76ccc-ba15-4df0-bfaa-94920fe596ea a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Get console output#033[00m
Jan 30 04:30:54 np0005601977 nova_compute[183130]: 2026-01-30 09:30:54.704 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:30:55 np0005601977 nova_compute[183130]: 2026-01-30 09:30:55.377 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:30:56 np0005601977 nova_compute[183130]: 2026-01-30 09:30:56.285 183134 DEBUG nova.compute.manager [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-changed-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:30:56 np0005601977 nova_compute[183130]: 2026-01-30 09:30:56.285 183134 DEBUG nova.compute.manager [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Refreshing instance network info cache due to event network-changed-24438658-5388-4fa2-a1bb-4a7cc225f3ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:30:56 np0005601977 nova_compute[183130]: 2026-01-30 09:30:56.286 183134 DEBUG oslo_concurrency.lockutils [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:30:56 np0005601977 nova_compute[183130]: 2026-01-30 09:30:56.286 183134 DEBUG oslo_concurrency.lockutils [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:30:56 np0005601977 nova_compute[183130]: 2026-01-30 09:30:56.286 183134 DEBUG nova.network.neutron [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Refreshing network info cache for port 24438658-5388-4fa2-a1bb-4a7cc225f3ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:30:56 np0005601977 podman[217382]: 2026-01-30 09:30:56.918044926 +0000 UTC m=+0.132776695 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:30:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:57.383 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:57.383 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:30:57.384 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:57 np0005601977 nova_compute[183130]: 2026-01-30 09:30:57.947 183134 DEBUG nova.network.neutron [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updated VIF entry in instance network info cache for port 24438658-5388-4fa2-a1bb-4a7cc225f3ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:30:57 np0005601977 nova_compute[183130]: 2026-01-30 09:30:57.948 183134 DEBUG nova.network.neutron [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updating instance_info_cache with network_info: [{"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:30:58 np0005601977 nova_compute[183130]: 2026-01-30 09:30:58.044 183134 DEBUG oslo_concurrency.lockutils [req-2b8c2674-6fe7-4f75-a108-4de5dbc9672a req-370ccf95-3679-4ccc-93b3-04d0b78be238 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-510e72ed-ac04-4a15-babc-98d067a699fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:30:59 np0005601977 nova_compute[183130]: 2026-01-30 09:30:59.318 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.365 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.366 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.380 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.457 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.504 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.505 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.550 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.555 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.596 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.597 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.648 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.654 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.701 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.702 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.748 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.902 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.904 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5141MB free_disk=73.27313232421875GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.904 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.904 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.997 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 7a073e24-c800-4962-af5e-ff5400800f34 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.998 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 93629e5c-ca92-47ac-8567-35d85b4e2a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.998 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 510e72ed-ac04-4a15-babc-98d067a699fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.998 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:31:00 np0005601977 nova_compute[183130]: 2026-01-30 09:31:00.998 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:31:01 np0005601977 nova_compute[183130]: 2026-01-30 09:31:01.078 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:01 np0005601977 nova_compute[183130]: 2026-01-30 09:31:01.097 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:01 np0005601977 nova_compute[183130]: 2026-01-30 09:31:01.138 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:31:01 np0005601977 nova_compute[183130]: 2026-01-30 09:31:01.138 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.113 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.114 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.267 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.287 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765449.2859154, 22bc0323-ee7d-4b6e-992e-a2410bf240e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.287 183134 INFO nova.compute.manager [-] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.320 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.340 183134 DEBUG nova.compute.manager [None req-4ea0bfef-e57f-4050-bcbc-551c8be2cca4 - - - - - -] [instance: 22bc0323-ee7d-4b6e-992e-a2410bf240e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.398 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.399 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.407 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.407 183134 INFO nova.compute.claims [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.702 183134 DEBUG nova.compute.provider_tree [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.724 183134 DEBUG nova.scheduler.client.report [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.750 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.751 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.793 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.794 183134 DEBUG nova.network.neutron [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.833 183134 INFO nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:31:04 np0005601977 nova_compute[183130]: 2026-01-30 09:31:04.885 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.023 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.025 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.025 183134 INFO nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Creating image(s)#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.026 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.027 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.028 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.066 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.116 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.117 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.117 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.132 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.150 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.182 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.188 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.189 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.216 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.217 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.217 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.285 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.286 183134 DEBUG nova.virt.disk.api [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.287 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.339 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.340 183134 DEBUG nova.virt.disk.api [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.340 183134 DEBUG nova.objects.instance [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid d7ac9bd2-23a9-4add-b755-5a26fdfe7862 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.421 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.497 183134 DEBUG nova.policy [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.503 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.503 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Ensure instance console log exists: /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.503 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.504 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:05 np0005601977 nova_compute[183130]: 2026-01-30 09:31:05.504 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:05 np0005601977 podman[217444]: 2026-01-30 09:31:05.832955037 +0000 UTC m=+0.050074612 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:31:06 np0005601977 nova_compute[183130]: 2026-01-30 09:31:06.307 183134 DEBUG nova.network.neutron [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Successfully created port: 9021a9ab-13d1-4709-b167-1971ec480bf2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:31:06 np0005601977 nova_compute[183130]: 2026-01-30 09:31:06.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.534 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.534 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.534 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.561 183134 DEBUG nova.network.neutron [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Successfully updated port: 9021a9ab-13d1-4709-b167-1971ec480bf2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.579 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.580 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.580 183134 DEBUG nova.network.neutron [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.664 183134 DEBUG nova.compute.manager [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-changed-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.665 183134 DEBUG nova.compute.manager [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Refreshing instance network info cache due to event network-changed-9021a9ab-13d1-4709-b167-1971ec480bf2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.665 183134 DEBUG oslo_concurrency.lockutils [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:07 np0005601977 nova_compute[183130]: 2026-01-30 09:31:07.709 183134 DEBUG nova.network.neutron [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.321 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.430 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updating instance_info_cache with network_info: [{"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.432 183134 DEBUG nova.network.neutron [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Updating instance_info_cache with network_info: [{"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.472 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-7a073e24-c800-4962-af5e-ff5400800f34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.473 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.473 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.473 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.474 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.474 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.493 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.493 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Instance network_info: |[{"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.494 183134 DEBUG oslo_concurrency.lockutils [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.495 183134 DEBUG nova.network.neutron [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Refreshing network info cache for port 9021a9ab-13d1-4709-b167-1971ec480bf2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.500 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Start _get_guest_xml network_info=[{"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.505 183134 WARNING nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.510 183134 DEBUG nova.virt.libvirt.host [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.510 183134 DEBUG nova.virt.libvirt.host [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.514 183134 DEBUG nova.virt.libvirt.host [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.514 183134 DEBUG nova.virt.libvirt.host [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.515 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.515 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.515 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.516 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.516 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.516 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.516 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.517 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.517 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.517 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.517 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.518 183134 DEBUG nova.virt.hardware [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.520 183134 DEBUG nova.virt.libvirt.vif [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-286632656',display_name='tempest-TestNetworkBasicOps-server-286632656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-286632656',id=25,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEM2OHCbWaK/MnKdQvTKptgEKy/m0p5KRD77VXmCkl08WjI8yjevXTAPFufS0hp9fBZ0qOW+G5AJXmRQITOCr5T10zMe2ByXVJ/U8WPNENkNQLd7a7k17NnvLLM/4N5GQ==',key_name='tempest-TestNetworkBasicOps-1194918393',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-khti0vix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:31:04Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=d7ac9bd2-23a9-4add-b755-5a26fdfe7862,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.521 183134 DEBUG nova.network.os_vif_util [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.521 183134 DEBUG nova.network.os_vif_util [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.522 183134 DEBUG nova.objects.instance [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid d7ac9bd2-23a9-4add-b755-5a26fdfe7862 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.592 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <uuid>d7ac9bd2-23a9-4add-b755-5a26fdfe7862</uuid>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <name>instance-00000019</name>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-286632656</nova:name>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:31:09</nova:creationTime>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        <nova:port uuid="9021a9ab-13d1-4709-b167-1971ec480bf2">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <entry name="serial">d7ac9bd2-23a9-4add-b755-5a26fdfe7862</entry>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <entry name="uuid">d7ac9bd2-23a9-4add-b755-5a26fdfe7862</entry>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.config"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:17:1d:49"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <target dev="tap9021a9ab-13"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/console.log" append="off"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:31:09 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:31:09 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:31:09 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:31:09 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.593 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Preparing to wait for external event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.593 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.593 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.593 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.594 183134 DEBUG nova.virt.libvirt.vif [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-286632656',display_name='tempest-TestNetworkBasicOps-server-286632656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-286632656',id=25,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEM2OHCbWaK/MnKdQvTKptgEKy/m0p5KRD77VXmCkl08WjI8yjevXTAPFufS0hp9fBZ0qOW+G5AJXmRQITOCr5T10zMe2ByXVJ/U8WPNENkNQLd7a7k17NnvLLM/4N5GQ==',key_name='tempest-TestNetworkBasicOps-1194918393',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-khti0vix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:31:04Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=d7ac9bd2-23a9-4add-b755-5a26fdfe7862,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.594 183134 DEBUG nova.network.os_vif_util [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.595 183134 DEBUG nova.network.os_vif_util [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.595 183134 DEBUG os_vif [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.596 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.596 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.596 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.598 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.599 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9021a9ab-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.599 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9021a9ab-13, col_values=(('external_ids', {'iface-id': '9021a9ab-13d1-4709-b167-1971ec480bf2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:1d:49', 'vm-uuid': 'd7ac9bd2-23a9-4add-b755-5a26fdfe7862'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.601 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:09 np0005601977 NetworkManager[55565]: <info>  [1769765469.6020] manager: (tap9021a9ab-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.603 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.606 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.608 183134 INFO os_vif [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13')#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.664 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.664 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.664 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:17:1d:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.665 183134 INFO nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Using config drive#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.988 183134 INFO nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Creating config drive at /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.config#033[00m
Jan 30 04:31:09 np0005601977 nova_compute[183130]: 2026-01-30 09:31:09.992 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0pr8hdq3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.116 183134 DEBUG oslo_concurrency.processutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0pr8hdq3" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:10 np0005601977 kernel: tap9021a9ab-13: entered promiscuous mode
Jan 30 04:31:10 np0005601977 NetworkManager[55565]: <info>  [1769765470.1742] manager: (tap9021a9ab-13): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Jan 30 04:31:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:10Z|00205|binding|INFO|Claiming lport 9021a9ab-13d1-4709-b167-1971ec480bf2 for this chassis.
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.175 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:10Z|00206|binding|INFO|9021a9ab-13d1-4709-b167-1971ec480bf2: Claiming fa:16:3e:17:1d:49 10.100.0.6
Jan 30 04:31:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:10Z|00207|binding|INFO|Setting lport 9021a9ab-13d1-4709-b167-1971ec480bf2 ovn-installed in OVS
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.183 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.185 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:10 np0005601977 systemd-udevd[217487]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:31:10 np0005601977 systemd-machined[154431]: New machine qemu-16-instance-00000019.
Jan 30 04:31:10 np0005601977 NetworkManager[55565]: <info>  [1769765470.2106] device (tap9021a9ab-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:31:10 np0005601977 NetworkManager[55565]: <info>  [1769765470.2111] device (tap9021a9ab-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:31:10 np0005601977 systemd[1]: Started Virtual Machine qemu-16-instance-00000019.
Jan 30 04:31:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:10Z|00208|binding|INFO|Setting lport 9021a9ab-13d1-4709-b167-1971ec480bf2 up in Southbound
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.228 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:1d:49 10.100.0.6'], port_security=['fa:16:3e:17:1d:49 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd7ac9bd2-23a9-4add-b755-5a26fdfe7862', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bf07673-f284-4722-a20b-66fb3bba1a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3506ce85-9def-48fe-941a-2eb4d865ee70', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7912079a-8212-4607-b4d0-2d35aabace59, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=9021a9ab-13d1-4709-b167-1971ec480bf2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.230 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 9021a9ab-13d1-4709-b167-1971ec480bf2 in datapath 4bf07673-f284-4722-a20b-66fb3bba1a03 bound to our chassis#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.231 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bf07673-f284-4722-a20b-66fb3bba1a03#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.242 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd8a413-1adc-4f46-a25e-88c91b148883]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.261 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c6809282-e41c-48a1-b5a1-7dfcf6d89465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.264 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[92db5680-63f2-4c21-9ff7-8af10f55148a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.282 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[831b4133-e9b5-4773-b6d7-7d530cbf8508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.294 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9df24313-b5c9-410c-91ab-e45f189d2db3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bf07673-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:9d:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396336, 'reachable_time': 36063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217502, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.304 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a637fb0b-697a-4fd1-99a4-7c8c34cf4b58]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4bf07673-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396347, 'tstamp': 396347}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217503, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4bf07673-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396349, 'tstamp': 396349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217503, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.306 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf07673-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.307 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.308 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bf07673-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.309 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.309 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bf07673-f0, col_values=(('external_ids', {'iface-id': '7e936c7b-5f33-4a17-8358-59044e61e6d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:31:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:10.309 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.423 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.854 183134 DEBUG nova.network.neutron [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Updated VIF entry in instance network info cache for port 9021a9ab-13d1-4709-b167-1971ec480bf2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.855 183134 DEBUG nova.network.neutron [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Updating instance_info_cache with network_info: [{"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.874 183134 DEBUG oslo_concurrency.lockutils [req-7509106f-4b25-4b39-ba93-ae2f7e66ab1b req-1fefea8b-ce52-452c-bb43-be89d767461c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.992 183134 DEBUG nova.compute.manager [req-dc03b0da-8e55-43cc-939d-23a3cbe82e49 req-ac463c23-3a57-4e90-bfaf-9048d9de3703 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.993 183134 DEBUG oslo_concurrency.lockutils [req-dc03b0da-8e55-43cc-939d-23a3cbe82e49 req-ac463c23-3a57-4e90-bfaf-9048d9de3703 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.993 183134 DEBUG oslo_concurrency.lockutils [req-dc03b0da-8e55-43cc-939d-23a3cbe82e49 req-ac463c23-3a57-4e90-bfaf-9048d9de3703 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.993 183134 DEBUG oslo_concurrency.lockutils [req-dc03b0da-8e55-43cc-939d-23a3cbe82e49 req-ac463c23-3a57-4e90-bfaf-9048d9de3703 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:31:10 np0005601977 nova_compute[183130]: 2026-01-30 09:31:10.994 183134 DEBUG nova.compute.manager [req-dc03b0da-8e55-43cc-939d-23a3cbe82e49 req-ac463c23-3a57-4e90-bfaf-9048d9de3703 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Processing event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.271 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.272 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765471.2707226, d7ac9bd2-23a9-4add-b755-5a26fdfe7862 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.272 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] VM Started (Lifecycle Event)
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.275 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.278 183134 INFO nova.virt.libvirt.driver [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Instance spawned successfully.
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.279 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.323 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.325 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.326 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.326 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.327 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.327 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.328 183134 DEBUG nova.virt.libvirt.driver [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.335 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.368 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.369 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765471.2708905, d7ac9bd2-23a9-4add-b755-5a26fdfe7862 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.369 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] VM Paused (Lifecycle Event)
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.397 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.401 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765471.2743657, d7ac9bd2-23a9-4add-b755-5a26fdfe7862 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.401 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] VM Resumed (Lifecycle Event)
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.406 183134 INFO nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Took 6.38 seconds to spawn the instance on the hypervisor.
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.406 183134 DEBUG nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.418 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.424 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.469 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.483 183134 INFO nova.compute.manager [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Took 7.10 seconds to build instance.
Jan 30 04:31:11 np0005601977 nova_compute[183130]: 2026-01-30 09:31:11.499 183134 DEBUG oslo_concurrency.lockutils [None req-d9a4df4e-1e9c-4b70-a56d-c55da7dfc390 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:31:12 np0005601977 nova_compute[183130]: 2026-01-30 09:31:12.469 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:31:13 np0005601977 nova_compute[183130]: 2026-01-30 09:31:13.090 183134 DEBUG nova.compute.manager [req-447ab321-efda-467a-9079-2fd40fa4930b req-5f4adafa-5e91-4ad8-924a-5b9e87e15dc8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:31:13 np0005601977 nova_compute[183130]: 2026-01-30 09:31:13.090 183134 DEBUG oslo_concurrency.lockutils [req-447ab321-efda-467a-9079-2fd40fa4930b req-5f4adafa-5e91-4ad8-924a-5b9e87e15dc8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:31:13 np0005601977 nova_compute[183130]: 2026-01-30 09:31:13.091 183134 DEBUG oslo_concurrency.lockutils [req-447ab321-efda-467a-9079-2fd40fa4930b req-5f4adafa-5e91-4ad8-924a-5b9e87e15dc8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:31:13 np0005601977 nova_compute[183130]: 2026-01-30 09:31:13.091 183134 DEBUG oslo_concurrency.lockutils [req-447ab321-efda-467a-9079-2fd40fa4930b req-5f4adafa-5e91-4ad8-924a-5b9e87e15dc8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:31:13 np0005601977 nova_compute[183130]: 2026-01-30 09:31:13.091 183134 DEBUG nova.compute.manager [req-447ab321-efda-467a-9079-2fd40fa4930b req-5f4adafa-5e91-4ad8-924a-5b9e87e15dc8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] No waiting events found dispatching network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:31:13 np0005601977 nova_compute[183130]: 2026-01-30 09:31:13.091 183134 WARNING nova.compute.manager [req-447ab321-efda-467a-9079-2fd40fa4930b req-5f4adafa-5e91-4ad8-924a-5b9e87e15dc8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received unexpected event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 for instance with vm_state active and task_state None.
Jan 30 04:31:14 np0005601977 nova_compute[183130]: 2026-01-30 09:31:14.618 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:14 np0005601977 podman[217512]: 2026-01-30 09:31:14.857153631 +0000 UTC m=+0.057635738 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:31:14 np0005601977 podman[217511]: 2026-01-30 09:31:14.858623523 +0000 UTC m=+0.059559033 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:31:15 np0005601977 nova_compute[183130]: 2026-01-30 09:31:15.299 183134 DEBUG nova.compute.manager [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-changed-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:31:15 np0005601977 nova_compute[183130]: 2026-01-30 09:31:15.300 183134 DEBUG nova.compute.manager [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Refreshing instance network info cache due to event network-changed-9021a9ab-13d1-4709-b167-1971ec480bf2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:31:15 np0005601977 nova_compute[183130]: 2026-01-30 09:31:15.300 183134 DEBUG oslo_concurrency.lockutils [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:31:15 np0005601977 nova_compute[183130]: 2026-01-30 09:31:15.300 183134 DEBUG oslo_concurrency.lockutils [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:31:15 np0005601977 nova_compute[183130]: 2026-01-30 09:31:15.300 183134 DEBUG nova.network.neutron [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Refreshing network info cache for port 9021a9ab-13d1-4709-b167-1971ec480bf2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:31:15 np0005601977 nova_compute[183130]: 2026-01-30 09:31:15.425 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:17 np0005601977 nova_compute[183130]: 2026-01-30 09:31:17.384 183134 DEBUG nova.network.neutron [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Updated VIF entry in instance network info cache for port 9021a9ab-13d1-4709-b167-1971ec480bf2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:31:17 np0005601977 nova_compute[183130]: 2026-01-30 09:31:17.385 183134 DEBUG nova.network.neutron [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Updating instance_info_cache with network_info: [{"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:31:17 np0005601977 nova_compute[183130]: 2026-01-30 09:31:17.413 183134 DEBUG oslo_concurrency.lockutils [req-5471eaac-f4d1-4d71-aeb9-1cd63376d2d0 req-587970b0-1ec0-4656-b384-6ceda5e396c4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-d7ac9bd2-23a9-4add-b755-5a26fdfe7862" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:31:19 np0005601977 nova_compute[183130]: 2026-01-30 09:31:19.622 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:20 np0005601977 nova_compute[183130]: 2026-01-30 09:31:20.428 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:22 np0005601977 podman[217558]: 2026-01-30 09:31:22.825260352 +0000 UTC m=+0.039951333 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:31:22 np0005601977 podman[217557]: 2026-01-30 09:31:22.859911562 +0000 UTC m=+0.075453517 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:31:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:24Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:1d:49 10.100.0.6
Jan 30 04:31:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:24Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:1d:49 10.100.0.6
Jan 30 04:31:24 np0005601977 nova_compute[183130]: 2026-01-30 09:31:24.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:25 np0005601977 nova_compute[183130]: 2026-01-30 09:31:25.431 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:27 np0005601977 podman[217615]: 2026-01-30 09:31:27.871919793 +0000 UTC m=+0.086861812 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:31:29 np0005601977 nova_compute[183130]: 2026-01-30 09:31:29.625 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:30 np0005601977 nova_compute[183130]: 2026-01-30 09:31:30.435 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.557 183134 INFO nova.compute.manager [None req-8fd29671-bffa-4e68-9205-e692f8519be1 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Get console output#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.565 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.984 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.984 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.985 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.985 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.985 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.986 183134 INFO nova.compute.manager [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Terminating instance#033[00m
Jan 30 04:31:31 np0005601977 nova_compute[183130]: 2026-01-30 09:31:31.987 183134 DEBUG nova.compute.manager [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:31:32 np0005601977 kernel: tap9021a9ab-13 (unregistering): left promiscuous mode
Jan 30 04:31:32 np0005601977 NetworkManager[55565]: <info>  [1769765492.0122] device (tap9021a9ab-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:31:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:32Z|00209|binding|INFO|Releasing lport 9021a9ab-13d1-4709-b167-1971ec480bf2 from this chassis (sb_readonly=0)
Jan 30 04:31:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:32Z|00210|binding|INFO|Setting lport 9021a9ab-13d1-4709-b167-1971ec480bf2 down in Southbound
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.017 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:32Z|00211|binding|INFO|Removing iface tap9021a9ab-13 ovn-installed in OVS
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.020 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.024 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.028 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:1d:49 10.100.0.6'], port_security=['fa:16:3e:17:1d:49 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd7ac9bd2-23a9-4add-b755-5a26fdfe7862', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bf07673-f284-4722-a20b-66fb3bba1a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3506ce85-9def-48fe-941a-2eb4d865ee70', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7912079a-8212-4607-b4d0-2d35aabace59, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=9021a9ab-13d1-4709-b167-1971ec480bf2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.029 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 9021a9ab-13d1-4709-b167-1971ec480bf2 in datapath 4bf07673-f284-4722-a20b-66fb3bba1a03 unbound from our chassis#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.031 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bf07673-f284-4722-a20b-66fb3bba1a03#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.041 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6a21a809-e1a4-432f-8c35-94375bb1a011]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:32 np0005601977 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 30 04:31:32 np0005601977 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Consumed 12.903s CPU time.
Jan 30 04:31:32 np0005601977 systemd-machined[154431]: Machine qemu-16-instance-00000019 terminated.
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.064 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[fa518038-9f66-4dc7-bce4-69a8f14af9f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.066 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[93806b8e-6080-47d1-9d52-c9f4e024359b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.086 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b90f56ae-2011-41ae-9e25-38891c1bb2ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.101 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ac767601-eeba-4a14-9831-ef6826d0da96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bf07673-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:9d:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396336, 'reachable_time': 36063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217654, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.115 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9851c8be-6fae-4e2c-9adb-fc62ae6fff0d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4bf07673-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396347, 'tstamp': 396347}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217655, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4bf07673-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396349, 'tstamp': 396349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217655, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.116 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf07673-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.117 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.120 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bf07673-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.120 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.120 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.121 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bf07673-f0, col_values=(('external_ids', {'iface-id': '7e936c7b-5f33-4a17-8358-59044e61e6d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:32.121 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.249 183134 INFO nova.virt.libvirt.driver [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Instance destroyed successfully.#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.251 183134 DEBUG nova.objects.instance [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid d7ac9bd2-23a9-4add-b755-5a26fdfe7862 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.267 183134 DEBUG nova.virt.libvirt.vif [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-286632656',display_name='tempest-TestNetworkBasicOps-server-286632656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-286632656',id=25,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEM2OHCbWaK/MnKdQvTKptgEKy/m0p5KRD77VXmCkl08WjI8yjevXTAPFufS0hp9fBZ0qOW+G5AJXmRQITOCr5T10zMe2ByXVJ/U8WPNENkNQLd7a7k17NnvLLM/4N5GQ==',key_name='tempest-TestNetworkBasicOps-1194918393',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:31:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-khti0vix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:31:11Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=d7ac9bd2-23a9-4add-b755-5a26fdfe7862,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.269 183134 DEBUG nova.network.os_vif_util [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "9021a9ab-13d1-4709-b167-1971ec480bf2", "address": "fa:16:3e:17:1d:49", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9021a9ab-13", "ovs_interfaceid": "9021a9ab-13d1-4709-b167-1971ec480bf2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.270 183134 DEBUG nova.network.os_vif_util [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.270 183134 DEBUG os_vif [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.273 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.273 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9021a9ab-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.277 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.281 183134 INFO os_vif [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:1d:49,bridge_name='br-int',has_traffic_filtering=True,id=9021a9ab-13d1-4709-b167-1971ec480bf2,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9021a9ab-13')#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.281 183134 INFO nova.virt.libvirt.driver [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Deleting instance files /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862_del#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.282 183134 INFO nova.virt.libvirt.driver [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Deletion of /var/lib/nova/instances/d7ac9bd2-23a9-4add-b755-5a26fdfe7862_del complete#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.354 183134 INFO nova.compute.manager [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.354 183134 DEBUG oslo.service.loopingcall [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.355 183134 DEBUG nova.compute.manager [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:31:32 np0005601977 nova_compute[183130]: 2026-01-30 09:31:32.355 183134 DEBUG nova.network.neutron [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.013 183134 DEBUG nova.compute.manager [req-93f510d9-4cc5-4e77-8773-aa717af5405f req-e8ded557-7e9e-447d-b6ee-23508a476dc1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-vif-unplugged-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.014 183134 DEBUG oslo_concurrency.lockutils [req-93f510d9-4cc5-4e77-8773-aa717af5405f req-e8ded557-7e9e-447d-b6ee-23508a476dc1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.014 183134 DEBUG oslo_concurrency.lockutils [req-93f510d9-4cc5-4e77-8773-aa717af5405f req-e8ded557-7e9e-447d-b6ee-23508a476dc1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.014 183134 DEBUG oslo_concurrency.lockutils [req-93f510d9-4cc5-4e77-8773-aa717af5405f req-e8ded557-7e9e-447d-b6ee-23508a476dc1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.015 183134 DEBUG nova.compute.manager [req-93f510d9-4cc5-4e77-8773-aa717af5405f req-e8ded557-7e9e-447d-b6ee-23508a476dc1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] No waiting events found dispatching network-vif-unplugged-9021a9ab-13d1-4709-b167-1971ec480bf2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.015 183134 DEBUG nova.compute.manager [req-93f510d9-4cc5-4e77-8773-aa717af5405f req-e8ded557-7e9e-447d-b6ee-23508a476dc1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-vif-unplugged-9021a9ab-13d1-4709-b167-1971ec480bf2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.561 183134 DEBUG nova.network.neutron [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.610 183134 INFO nova.compute.manager [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.670 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.671 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.784 183134 DEBUG nova.compute.provider_tree [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.800 183134 DEBUG nova.scheduler.client.report [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.819 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.850 183134 INFO nova.scheduler.client.report [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance d7ac9bd2-23a9-4add-b755-5a26fdfe7862#033[00m
Jan 30 04:31:33 np0005601977 nova_compute[183130]: 2026-01-30 09:31:33.937 183134 DEBUG oslo_concurrency.lockutils [None req-27a3d8d1-d3c5-4819-aaa0-346aa24f55de a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.153 183134 DEBUG nova.compute.manager [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.154 183134 DEBUG oslo_concurrency.lockutils [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.154 183134 DEBUG oslo_concurrency.lockutils [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.154 183134 DEBUG oslo_concurrency.lockutils [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "d7ac9bd2-23a9-4add-b755-5a26fdfe7862-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.154 183134 DEBUG nova.compute.manager [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] No waiting events found dispatching network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.155 183134 WARNING nova.compute.manager [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received unexpected event network-vif-plugged-9021a9ab-13d1-4709-b167-1971ec480bf2 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.155 183134 DEBUG nova.compute.manager [req-73dd020a-70f0-4817-ab35-c266a0b8b0fa req-142e72b5-c6e2-4c40-953e-9b69a813d19a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Received event network-vif-deleted-9021a9ab-13d1-4709-b167-1971ec480bf2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.478 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.804 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.804 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.805 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.805 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.805 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.806 183134 INFO nova.compute.manager [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Terminating instance#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.807 183134 DEBUG nova.compute.manager [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:31:35 np0005601977 kernel: tap24438658-53 (unregistering): left promiscuous mode
Jan 30 04:31:35 np0005601977 NetworkManager[55565]: <info>  [1769765495.8355] device (tap24438658-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:31:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:35Z|00212|binding|INFO|Releasing lport 24438658-5388-4fa2-a1bb-4a7cc225f3ca from this chassis (sb_readonly=0)
Jan 30 04:31:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:35Z|00213|binding|INFO|Setting lport 24438658-5388-4fa2-a1bb-4a7cc225f3ca down in Southbound
Jan 30 04:31:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:35Z|00214|binding|INFO|Removing iface tap24438658-53 ovn-installed in OVS
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.838 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.841 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:35.848 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:b0:a8 10.100.0.10'], port_security=['fa:16:3e:09:b0:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '510e72ed-ac04-4a15-babc-98d067a699fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bf07673-f284-4722-a20b-66fb3bba1a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c0bd8cd5-4cca-4fa1-8f73-177b23e189a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7912079a-8212-4607-b4d0-2d35aabace59, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=24438658-5388-4fa2-a1bb-4a7cc225f3ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:35 np0005601977 nova_compute[183130]: 2026-01-30 09:31:35.848 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:35.850 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 24438658-5388-4fa2-a1bb-4a7cc225f3ca in datapath 4bf07673-f284-4722-a20b-66fb3bba1a03 unbound from our chassis#033[00m
Jan 30 04:31:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:35.852 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4bf07673-f284-4722-a20b-66fb3bba1a03, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:31:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:35.853 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2b53dc-8d92-4926-9f97-668a1a5824d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:35.854 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03 namespace which is not needed anymore#033[00m
Jan 30 04:31:35 np0005601977 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 30 04:31:35 np0005601977 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000017.scope: Consumed 13.885s CPU time.
Jan 30 04:31:35 np0005601977 systemd-machined[154431]: Machine qemu-15-instance-00000017 terminated.
Jan 30 04:31:35 np0005601977 podman[217678]: 2026-01-30 09:31:35.946026653 +0000 UTC m=+0.075753292 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:31:35 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [NOTICE]   (217156) : haproxy version is 2.8.14-c23fe91
Jan 30 04:31:35 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [NOTICE]   (217156) : path to executable is /usr/sbin/haproxy
Jan 30 04:31:35 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [WARNING]  (217156) : Exiting Master process...
Jan 30 04:31:35 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [WARNING]  (217156) : Exiting Master process...
Jan 30 04:31:35 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [ALERT]    (217156) : Current worker (217158) exited with code 143 (Terminated)
Jan 30 04:31:35 np0005601977 neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03[217150]: [WARNING]  (217156) : All workers exited. Exiting... (0)
Jan 30 04:31:35 np0005601977 systemd[1]: libpod-1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86.scope: Deactivated successfully.
Jan 30 04:31:35 np0005601977 conmon[217150]: conmon 1cb33e897bc9856fa77d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86.scope/container/memory.events
Jan 30 04:31:35 np0005601977 podman[217723]: 2026-01-30 09:31:35.964615748 +0000 UTC m=+0.043312611 container died 1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:31:35 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86-userdata-shm.mount: Deactivated successfully.
Jan 30 04:31:35 np0005601977 systemd[1]: var-lib-containers-storage-overlay-337789b96227970c4256f75367d45c3a773007aaf88b458349f5a12b261a253f-merged.mount: Deactivated successfully.
Jan 30 04:31:36 np0005601977 podman[217723]: 2026-01-30 09:31:36.002708284 +0000 UTC m=+0.081405147 container cleanup 1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:31:36 np0005601977 systemd[1]: libpod-conmon-1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86.scope: Deactivated successfully.
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.046 183134 INFO nova.virt.libvirt.driver [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Instance destroyed successfully.#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.046 183134 DEBUG nova.objects.instance [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid 510e72ed-ac04-4a15-babc-98d067a699fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:36 np0005601977 podman[217754]: 2026-01-30 09:31:36.048788835 +0000 UTC m=+0.033914965 container remove 1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.051 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[93583840-f557-4568-9855-59a1e67cb90d]: (4, ('Fri Jan 30 09:31:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03 (1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86)\n1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86\nFri Jan 30 09:31:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03 (1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86)\n1cb33e897bc9856fa77d8ad050721eb7c265caba1b14144b22aaaf2a68aabf86\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.053 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[effa799b-805d-48d9-86cc-155dfcd727bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.053 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf07673-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.055 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:36 np0005601977 kernel: tap4bf07673-f0: left promiscuous mode
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.061 183134 DEBUG nova.virt.libvirt.vif [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:30:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1756779848',display_name='tempest-TestNetworkBasicOps-server-1756779848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1756779848',id=23,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF7nBvfqfwGy2liDEg9UFi8jz2EW5wv6a3dNMz1Fig7uIT7wmLwG0gyZV3tPLN/mXtBT4KR4PDkroFmb3OmmVcTwxuN/C1qocWLf9C43upOGgL+ReeZ+dQ/I6+UAhTowqw==',key_name='tempest-TestNetworkBasicOps-828198191',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:30:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-b0z09rz4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:30:36Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=510e72ed-ac04-4a15-babc-98d067a699fa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.061 183134 DEBUG nova.network.os_vif_util [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "address": "fa:16:3e:09:b0:a8", "network": {"id": "4bf07673-f284-4722-a20b-66fb3bba1a03", "bridge": "br-int", "label": "tempest-network-smoke--1326577558", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24438658-53", "ovs_interfaceid": "24438658-5388-4fa2-a1bb-4a7cc225f3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.062 183134 DEBUG nova.network.os_vif_util [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.062 183134 DEBUG os_vif [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.063 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.064 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24438658-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.064 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c491a32c-08a2-4464-9e4f-c0622e41cf47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.065 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.065 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.070 183134 INFO os_vif [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:b0:a8,bridge_name='br-int',has_traffic_filtering=True,id=24438658-5388-4fa2-a1bb-4a7cc225f3ca,network=Network(4bf07673-f284-4722-a20b-66fb3bba1a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24438658-53')#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.070 183134 INFO nova.virt.libvirt.driver [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Deleting instance files /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa_del#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.071 183134 INFO nova.virt.libvirt.driver [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Deletion of /var/lib/nova/instances/510e72ed-ac04-4a15-babc-98d067a699fa_del complete#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.074 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e52eaab0-f3ed-45c4-8ad9-b6959b123fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.075 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c150d0ab-8693-483f-b27e-c66d200d635d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.083 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb7163e-91c7-4944-b4db-b9f593d7fb15]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396330, 'reachable_time': 37439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217788, 'error': None, 'target': 'ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.085 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4bf07673-f284-4722-a20b-66fb3bba1a03 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:31:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:36.085 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[9a74df03-fada-4cc7-8ff6-6e2760c589a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:36 np0005601977 systemd[1]: run-netns-ovnmeta\x2d4bf07673\x2df284\x2d4722\x2da20b\x2d66fb3bba1a03.mount: Deactivated successfully.
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.134 183134 INFO nova.compute.manager [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.135 183134 DEBUG oslo.service.loopingcall [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.135 183134 DEBUG nova.compute.manager [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.135 183134 DEBUG nova.network.neutron [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.570 183134 DEBUG nova.compute.manager [req-3d0d4cb6-64f6-4067-8f5f-67d3f762dd76 req-1b9e60d2-272d-4f26-80bb-edc31b284788 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-vif-unplugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.571 183134 DEBUG oslo_concurrency.lockutils [req-3d0d4cb6-64f6-4067-8f5f-67d3f762dd76 req-1b9e60d2-272d-4f26-80bb-edc31b284788 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.572 183134 DEBUG oslo_concurrency.lockutils [req-3d0d4cb6-64f6-4067-8f5f-67d3f762dd76 req-1b9e60d2-272d-4f26-80bb-edc31b284788 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.572 183134 DEBUG oslo_concurrency.lockutils [req-3d0d4cb6-64f6-4067-8f5f-67d3f762dd76 req-1b9e60d2-272d-4f26-80bb-edc31b284788 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.573 183134 DEBUG nova.compute.manager [req-3d0d4cb6-64f6-4067-8f5f-67d3f762dd76 req-1b9e60d2-272d-4f26-80bb-edc31b284788 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] No waiting events found dispatching network-vif-unplugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:31:36 np0005601977 nova_compute[183130]: 2026-01-30 09:31:36.573 183134 DEBUG nova.compute.manager [req-3d0d4cb6-64f6-4067-8f5f-67d3f762dd76 req-1b9e60d2-272d-4f26-80bb-edc31b284788 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-vif-unplugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.023 183134 DEBUG nova.network.neutron [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.040 183134 INFO nova.compute.manager [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Took 0.90 seconds to deallocate network for instance.#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.094 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.094 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.204 183134 DEBUG nova.compute.provider_tree [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.223 183134 DEBUG nova.scheduler.client.report [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.243 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.270 183134 INFO nova.scheduler.client.report [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 510e72ed-ac04-4a15-babc-98d067a699fa#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.374 183134 DEBUG oslo_concurrency.lockutils [None req-9ad2d44d-5618-4fc0-89f4-4fa524185def a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:37 np0005601977 nova_compute[183130]: 2026-01-30 09:31:37.381 183134 DEBUG nova.compute.manager [req-f4a3846d-0ad5-464f-b7ea-15757258f19f req-d6da0b7b-56a3-4622-8083-7023792b0fe4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-vif-deleted-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:38 np0005601977 nova_compute[183130]: 2026-01-30 09:31:38.932 183134 DEBUG nova.compute.manager [req-b95753a3-c39b-4bb3-bb99-bbacd8ffb394 req-18f0d396-3475-4af4-a2fd-9fb4008bf76b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:38 np0005601977 nova_compute[183130]: 2026-01-30 09:31:38.933 183134 DEBUG oslo_concurrency.lockutils [req-b95753a3-c39b-4bb3-bb99-bbacd8ffb394 req-18f0d396-3475-4af4-a2fd-9fb4008bf76b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:38 np0005601977 nova_compute[183130]: 2026-01-30 09:31:38.933 183134 DEBUG oslo_concurrency.lockutils [req-b95753a3-c39b-4bb3-bb99-bbacd8ffb394 req-18f0d396-3475-4af4-a2fd-9fb4008bf76b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:38 np0005601977 nova_compute[183130]: 2026-01-30 09:31:38.933 183134 DEBUG oslo_concurrency.lockutils [req-b95753a3-c39b-4bb3-bb99-bbacd8ffb394 req-18f0d396-3475-4af4-a2fd-9fb4008bf76b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "510e72ed-ac04-4a15-babc-98d067a699fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:38 np0005601977 nova_compute[183130]: 2026-01-30 09:31:38.934 183134 DEBUG nova.compute.manager [req-b95753a3-c39b-4bb3-bb99-bbacd8ffb394 req-18f0d396-3475-4af4-a2fd-9fb4008bf76b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] No waiting events found dispatching network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:31:38 np0005601977 nova_compute[183130]: 2026-01-30 09:31:38.934 183134 WARNING nova.compute.manager [req-b95753a3-c39b-4bb3-bb99-bbacd8ffb394 req-18f0d396-3475-4af4-a2fd-9fb4008bf76b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Received unexpected event network-vif-plugged-24438658-5388-4fa2-a1bb-4a7cc225f3ca for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:31:40 np0005601977 nova_compute[183130]: 2026-01-30 09:31:40.480 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:40Z|00215|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:31:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:40Z|00216|binding|INFO|Releasing lport 15b4d9a6-bad1-4bf8-a262-02e27eb8ea93 from this chassis (sb_readonly=0)
Jan 30 04:31:40 np0005601977 nova_compute[183130]: 2026-01-30 09:31:40.737 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:41 np0005601977 nova_compute[183130]: 2026-01-30 09:31:41.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:43.066 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:43 np0005601977 nova_compute[183130]: 2026-01-30 09:31:43.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:43.067 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:31:43 np0005601977 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 30 04:31:44 np0005601977 nova_compute[183130]: 2026-01-30 09:31:44.043 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:45 np0005601977 nova_compute[183130]: 2026-01-30 09:31:45.519 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:45 np0005601977 podman[217790]: 2026-01-30 09:31:45.873043285 +0000 UTC m=+0.075532195 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, release=1769056855, distribution-scope=public, name=ubi9/ubi-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:31:45 np0005601977 podman[217791]: 2026-01-30 09:31:45.900041537 +0000 UTC m=+0.098923562 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.590 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.591 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.632 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.693 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.694 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.724 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.755 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.755 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.761 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.761 183134 INFO nova.compute.claims [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:31:46 np0005601977 nova_compute[183130]: 2026-01-30 09:31:46.811 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.001 183134 DEBUG nova.compute.provider_tree [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.023 183134 DEBUG nova.scheduler.client.report [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.057 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.057 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.060 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.067 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.067 183134 INFO nova.compute.claims [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.123 183134 INFO nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.125 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.125 183134 DEBUG nova.network.neutron [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.153 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.246 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765492.245273, d7ac9bd2-23a9-4add-b755-5a26fdfe7862 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.247 183134 INFO nova.compute.manager [-] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.277 183134 DEBUG nova.compute.manager [None req-648f1dc2-cbf6-4a06-91e4-0608ad720edc - - - - - -] [instance: d7ac9bd2-23a9-4add-b755-5a26fdfe7862] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.288 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.289 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.290 183134 INFO nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Creating image(s)#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.290 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.291 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.292 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.310 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.332 183134 DEBUG nova.compute.provider_tree [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.354 183134 DEBUG nova.scheduler.client.report [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.361 183134 DEBUG nova.policy [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.384 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.384 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.389 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.390 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.391 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.407 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.433 183134 INFO nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.436 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.436 183134 DEBUG nova.network.neutron [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.454 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.466 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.466 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.492 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.493 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.494 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.539 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.540 183134 DEBUG nova.virt.disk.api [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.540 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.556 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.558 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.558 183134 INFO nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Creating image(s)#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.558 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.559 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.559 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.571 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.611 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.612 183134 DEBUG nova.virt.disk.api [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.613 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.613 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Ensure instance console log exists: /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.613 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.614 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.614 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.615 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.616 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.616 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.628 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.697 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.698 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.724 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.724 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.725 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.770 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.771 183134 DEBUG nova.virt.disk.api [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.771 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.786 183134 DEBUG nova.policy [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.817 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.817 183134 DEBUG nova.virt.disk.api [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.818 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.818 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Ensure instance console log exists: /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.818 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.819 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:47 np0005601977 nova_compute[183130]: 2026-01-30 09:31:47.819 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:48 np0005601977 nova_compute[183130]: 2026-01-30 09:31:48.485 183134 DEBUG nova.network.neutron [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Successfully created port: 1c1b6dde-b8fc-4af2-9a67-11240761a805 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:31:49 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:49.069 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:49 np0005601977 nova_compute[183130]: 2026-01-30 09:31:49.602 183134 DEBUG nova.network.neutron [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Successfully created port: 48d259d8-4396-4640-8f63-b475acc34639 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:31:49 np0005601977 nova_compute[183130]: 2026-01-30 09:31:49.627 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:50 np0005601977 nova_compute[183130]: 2026-01-30 09:31:50.521 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.045 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765496.044022, 510e72ed-ac04-4a15-babc-98d067a699fa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.046 183134 INFO nova.compute.manager [-] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.078 183134 DEBUG nova.compute.manager [None req-333565cd-e4d7-4f43-909b-89c986b79bee - - - - - -] [instance: 510e72ed-ac04-4a15-babc-98d067a699fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.809 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "7a073e24-c800-4962-af5e-ff5400800f34" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.810 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.810 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "7a073e24-c800-4962-af5e-ff5400800f34-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.810 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.810 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.812 183134 INFO nova.compute.manager [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Terminating instance#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.813 183134 DEBUG nova.compute.manager [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:31:51 np0005601977 kernel: tapfb902761-f0 (unregistering): left promiscuous mode
Jan 30 04:31:51 np0005601977 NetworkManager[55565]: <info>  [1769765511.8456] device (tapfb902761-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.852 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:51Z|00217|binding|INFO|Releasing lport fb902761-f001-4e8a-9c56-1bdc4fb6a88e from this chassis (sb_readonly=0)
Jan 30 04:31:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:51Z|00218|binding|INFO|Setting lport fb902761-f001-4e8a-9c56-1bdc4fb6a88e down in Southbound
Jan 30 04:31:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:51Z|00219|binding|INFO|Removing iface tapfb902761-f0 ovn-installed in OVS
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.855 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:51.862 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:52:dd 10.100.0.3'], port_security=['fa:16:3e:9b:52:dd 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7a073e24-c800-4962-af5e-ff5400800f34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58c1f09b90b6436c9e7154cd88c1ba5f', 'neutron:revision_number': '13', 'neutron:security_group_ids': '7061d6e3-fadd-4588-92c3-9c8afe539ede', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76c186c3-e40e-4db5-b50c-3686091722f9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=fb902761-f001-4e8a-9c56-1bdc4fb6a88e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:51 np0005601977 nova_compute[183130]: 2026-01-30 09:31:51.863 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:51.865 104706 INFO neutron.agent.ovn.metadata.agent [-] Port fb902761-f001-4e8a-9c56-1bdc4fb6a88e in datapath 8e0e3ea2-5897-4c05-8f15-ccf8330993c7 unbound from our chassis#033[00m
Jan 30 04:31:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:51.868 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e0e3ea2-5897-4c05-8f15-ccf8330993c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:31:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:51.870 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc04e6a-91b6-4515-9d79-e1f56bcdfb48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:51.871 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7 namespace which is not needed anymore#033[00m
Jan 30 04:31:51 np0005601977 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 30 04:31:51 np0005601977 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 15.981s CPU time.
Jan 30 04:31:51 np0005601977 systemd-machined[154431]: Machine qemu-3-instance-00000006 terminated.
Jan 30 04:31:51 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [NOTICE]   (213306) : haproxy version is 2.8.14-c23fe91
Jan 30 04:31:51 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [NOTICE]   (213306) : path to executable is /usr/sbin/haproxy
Jan 30 04:31:51 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [WARNING]  (213306) : Exiting Master process...
Jan 30 04:31:51 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [WARNING]  (213306) : Exiting Master process...
Jan 30 04:31:51 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [ALERT]    (213306) : Current worker (213308) exited with code 143 (Terminated)
Jan 30 04:31:51 np0005601977 neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7[213302]: [WARNING]  (213306) : All workers exited. Exiting... (0)
Jan 30 04:31:51 np0005601977 systemd[1]: libpod-8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b.scope: Deactivated successfully.
Jan 30 04:31:51 np0005601977 podman[217884]: 2026-01-30 09:31:51.986463951 +0000 UTC m=+0.042228459 container died 8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:31:52 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b-userdata-shm.mount: Deactivated successfully.
Jan 30 04:31:52 np0005601977 systemd[1]: var-lib-containers-storage-overlay-268efb99abd6e31584aa827875d86e9990f9ff6e50f730d802c5bdf2c3353f2b-merged.mount: Deactivated successfully.
Jan 30 04:31:52 np0005601977 podman[217884]: 2026-01-30 09:31:52.017313765 +0000 UTC m=+0.073078273 container cleanup 8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:31:52 np0005601977 systemd[1]: libpod-conmon-8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b.scope: Deactivated successfully.
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.066 183134 INFO nova.virt.libvirt.driver [-] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Instance destroyed successfully.#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.067 183134 DEBUG nova.objects.instance [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lazy-loading 'resources' on Instance uuid 7a073e24-c800-4962-af5e-ff5400800f34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:52 np0005601977 podman[217916]: 2026-01-30 09:31:52.081236869 +0000 UTC m=+0.048888954 container remove 8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.085 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7f10a741-2eda-4072-b937-851aa461bf0a]: (4, ('Fri Jan 30 09:31:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7 (8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b)\n8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b\nFri Jan 30 09:31:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7 (8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b)\n8338b08263d6beb503a68cbedc82b9695ce573cb00aa162052761144b5a9736b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.087 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c50e22ac-3ca4-4e34-87c7-af68fd48f626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.088 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e3ea2-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.090 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:52 np0005601977 kernel: tap8e0e3ea2-50: left promiscuous mode
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.097 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.100 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1dca79b5-1a0a-4dc9-98b3-9e05f1d80975]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.119 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[907870b8-b87d-497b-88f6-24feddf9ed47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.120 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[364b61ab-6341-415e-922c-3029af63dfed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.135 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f84237-a535-4dda-aa84-a82450f6d954]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366978, 'reachable_time': 18774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217951, 'error': None, 'target': 'ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.137 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e0e3ea2-5897-4c05-8f15-ccf8330993c7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:31:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:52.137 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c323fe-2137-4b74-bf6b-773717cadbe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:52 np0005601977 systemd[1]: run-netns-ovnmeta\x2d8e0e3ea2\x2d5897\x2d4c05\x2d8f15\x2dccf8330993c7.mount: Deactivated successfully.
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.141 183134 DEBUG nova.virt.libvirt.vif [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:23:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1403442336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1403442336',id=6,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:24:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58c1f09b90b6436c9e7154cd88c1ba5f',ramdisk_id='',reservation_id='r-50if40mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1955884209',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1955884209-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:25:45Z,user_data=None,user_id='3fd4ee63e94e4c3b9a3e4cefa7e0f626',uuid=7a073e24-c800-4962-af5e-ff5400800f34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.141 183134 DEBUG nova.network.os_vif_util [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converting VIF {"id": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "address": "fa:16:3e:9b:52:dd", "network": {"id": "8e0e3ea2-5897-4c05-8f15-ccf8330993c7", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1938548603-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58c1f09b90b6436c9e7154cd88c1ba5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb902761-f0", "ovs_interfaceid": "fb902761-f001-4e8a-9c56-1bdc4fb6a88e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.142 183134 DEBUG nova.network.os_vif_util [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=fb902761-f001-4e8a-9c56-1bdc4fb6a88e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb902761-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.142 183134 DEBUG os_vif [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=fb902761-f001-4e8a-9c56-1bdc4fb6a88e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb902761-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.144 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.144 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb902761-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.146 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.148 183134 INFO os_vif [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=fb902761-f001-4e8a-9c56-1bdc4fb6a88e,network=Network(8e0e3ea2-5897-4c05-8f15-ccf8330993c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb902761-f0')#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.148 183134 INFO nova.virt.libvirt.driver [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Deleting instance files /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34_del#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.149 183134 INFO nova.virt.libvirt.driver [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Deletion of /var/lib/nova/instances/7a073e24-c800-4962-af5e-ff5400800f34_del complete#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.197 183134 INFO nova.compute.manager [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.197 183134 DEBUG oslo.service.loopingcall [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.198 183134 DEBUG nova.compute.manager [-] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.198 183134 DEBUG nova.network.neutron [-] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.306 183134 DEBUG nova.network.neutron [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Successfully updated port: 1c1b6dde-b8fc-4af2-9a67-11240761a805 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.329 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.329 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.329 183134 DEBUG nova.network.neutron [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:31:52 np0005601977 nova_compute[183130]: 2026-01-30 09:31:52.704 183134 DEBUG nova.network.neutron [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.266 183134 DEBUG nova.network.neutron [-] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.285 183134 INFO nova.compute.manager [-] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Took 1.09 seconds to deallocate network for instance.#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.336 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.337 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.503 183134 DEBUG nova.compute.provider_tree [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.524 183134 DEBUG nova.scheduler.client.report [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.550 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.597 183134 INFO nova.scheduler.client.report [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Deleted allocations for instance 7a073e24-c800-4962-af5e-ff5400800f34#033[00m
Jan 30 04:31:53 np0005601977 nova_compute[183130]: 2026-01-30 09:31:53.685 183134 DEBUG oslo_concurrency.lockutils [None req-be8accf9-3461-4689-84eb-b812546940e8 3fd4ee63e94e4c3b9a3e4cefa7e0f626 58c1f09b90b6436c9e7154cd88c1ba5f - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:53 np0005601977 podman[217953]: 2026-01-30 09:31:53.83103598 +0000 UTC m=+0.046693980 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:31:53 np0005601977 podman[217952]: 2026-01-30 09:31:53.854054795 +0000 UTC m=+0.069832398 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.061 183134 DEBUG nova.compute.manager [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Received event network-vif-unplugged-fb902761-f001-4e8a-9c56-1bdc4fb6a88e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.062 183134 DEBUG oslo_concurrency.lockutils [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "7a073e24-c800-4962-af5e-ff5400800f34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.062 183134 DEBUG oslo_concurrency.lockutils [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.062 183134 DEBUG oslo_concurrency.lockutils [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.062 183134 DEBUG nova.compute.manager [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] No waiting events found dispatching network-vif-unplugged-fb902761-f001-4e8a-9c56-1bdc4fb6a88e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.062 183134 WARNING nova.compute.manager [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Received unexpected event network-vif-unplugged-fb902761-f001-4e8a-9c56-1bdc4fb6a88e for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.063 183134 DEBUG nova.compute.manager [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Received event network-vif-plugged-fb902761-f001-4e8a-9c56-1bdc4fb6a88e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.063 183134 DEBUG oslo_concurrency.lockutils [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "7a073e24-c800-4962-af5e-ff5400800f34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.063 183134 DEBUG oslo_concurrency.lockutils [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.063 183134 DEBUG oslo_concurrency.lockutils [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "7a073e24-c800-4962-af5e-ff5400800f34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.063 183134 DEBUG nova.compute.manager [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] No waiting events found dispatching network-vif-plugged-fb902761-f001-4e8a-9c56-1bdc4fb6a88e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.063 183134 WARNING nova.compute.manager [req-7e68bc9e-9cb0-4ad5-99f4-94ce19ecb9c5 req-0c18d0ff-ab75-471e-bcff-64e9857940b6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Received unexpected event network-vif-plugged-fb902761-f001-4e8a-9c56-1bdc4fb6a88e for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.147 183134 DEBUG nova.compute.manager [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-changed-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.148 183134 DEBUG nova.compute.manager [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Refreshing instance network info cache due to event network-changed-1c1b6dde-b8fc-4af2-9a67-11240761a805. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.148 183134 DEBUG oslo_concurrency.lockutils [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.437 183134 DEBUG nova.network.neutron [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Successfully updated port: 48d259d8-4396-4640-8f63-b475acc34639 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.474 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.474 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:54 np0005601977 nova_compute[183130]: 2026-01-30 09:31:54.474 183134 DEBUG nova.network.neutron [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.452 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '396e2944b44f42e59b102db87e2e060c', 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'hostId': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.457 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56fc218b-8bfc-49db-9b43-3ba6a740c760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.453512', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '8459dcbe-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '78c2915d51e24f9292a7aea4c0ee97b66b2245ce0daf95266ea727ec766a1d11'}]}, 'timestamp': '2026-01-30 09:31:55.458550', '_unique_id': 'f5ece6e7cab04f2bbebc3eefa473eefb'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.460 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.461 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.462 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cbd3e23-bec8-49fd-be05-70d72e3960b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.462411', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '845a94b0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': 'a975b5ae0c6e7a6448b979e6756261b80aebe2b880433bd6e7dbc0a7e489541a'}]}, 'timestamp': '2026-01-30 09:31:55.463139', '_unique_id': '7b5fd2c005c843069e0ff2b6dfca7ff2'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.464 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.465 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.465 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56d092c4-ea62-478a-98ab-e915ff19fa8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.465829', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '845b1840-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '9c10bc57f4f385edcf444229913b3c2e52132472a5fcd22a72918e383eb656fd'}]}, 'timestamp': '2026-01-30 09:31:55.466422', '_unique_id': 'b758e3c08af1492da4de2a3a358216a3'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.468 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.498 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.requests volume: 329 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.498 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '387c2232-8ce2-4b35-a320-34a06de53a4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 329, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.468729', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '84601124-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': 'e3072eb6543c3ad3fb137d31e1e41afb006490b3fe0d6c38b817fe4106ea23f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.468729', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '84602448-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '420022f3abe0ba94faa79b140ec30b24653629aca4820cedda26a133120e5ac3'}]}, 'timestamp': '2026-01-30 09:31:55.499435', '_unique_id': '2a108a9dfe5c4544a09801fb480847f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.501 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.513 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.514 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '561d91e1-a837-4157-8320-25f2059c77de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.501821', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '84626262-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.896763398, 'message_signature': '4165eada6c812a13d24776e895dd2fdf9a91a2a52213bb36cf1398ca6f67bef3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.501821', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '846274dc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.896763398, 'message_signature': '44b1b78111ff7e74fea7d52d3ff3d360b009bb1da4dc8e142749fe7c48668778'}]}, 'timestamp': '2026-01-30 09:31:55.514642', '_unique_id': '5bff2299bd494e0a9dcf3060cf2cb028'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.515 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.517 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.517 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd64afb2c-1fec-471d-add9-59fdb63341ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.516979', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8462e584-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.896763398, 'message_signature': 'cacdf562eadac91bc23d9f7a948cff7f4d022452027926b883e5255149134b10'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.516979', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8462f998-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.896763398, 'message_signature': 'ae1a2aa0ab89383fde1367e9b393de8e0066c4edcfa20b5500b513c0047188db'}]}, 'timestamp': '2026-01-30 09:31:55.518029', '_unique_id': '3a763aa7877a4a43af01daab9319bf3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.520 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.packets volume: 45 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70e0a6e7-da13-41b7-87be-c33c6ea9102f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 45, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.520524', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '84636e14-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '616d1961f228fd6978068e108080717a80571a7d9c36165b3ec142b4dc10e450'}]}, 'timestamp': '2026-01-30 09:31:55.521011', '_unique_id': '3e3c4217d86440388c7b69bdb46f209e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.523 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 nova_compute[183130]: 2026-01-30 09:31:55.523 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '037793c9-aa9e-4df7-af93-285ef9dee962', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.523545', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '8463e524-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '736808312cf178394957315c55a95997ce30c24edf59bedffd3d94372c562f27'}]}, 'timestamp': '2026-01-30 09:31:55.524064', '_unique_id': '0136994fb1c4414ba08fc5088fc13f32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.526 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.bytes volume: 73072640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.526 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3f0bd8e-4924-46ed-9fdb-e046f3edcf7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73072640, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.526389', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '846455a4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': 'b2b574db09ca2f8659838b7a0d9bc1476cd7737a6998df8284c8e10a12157683'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.526389', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '84646af8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '36324ca698027f94f0a183e6f0b6f2f528937f3bd7fc98187a0d61040d6be832'}]}, 'timestamp': '2026-01-30 09:31:55.527548', '_unique_id': '725a976bb41c49a4a8e3c676699b7990'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.530 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.latency volume: 728377080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.530 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.latency volume: 53741683 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '031bce99-3859-4a88-bd31-cb1e5b8bc7bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 728377080, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.530102', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8464e56e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '424262cf35d2077eb9d4ed8a4ea811ef7a14e5752153403ee189584a46c1c1a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53741683, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.530102', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8464fa22-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': 'c2c11f14b3d2aedc9f10f860daf558fcbba57c618a59875c8c98efec2ae0d407'}]}, 'timestamp': '2026-01-30 09:31:55.531123', '_unique_id': '5101c1e1add14f5a9a966675e9c41d4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.533 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce7c6e16-84e5-479d-beff-c8514201ac6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.533753', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '84657538-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': 'b56fddacb0c2110009fc010cd4a394cc3f664a4f8ec3d4c097daa9468a975c40'}]}, 'timestamp': '2026-01-30 09:31:55.534376', '_unique_id': '1dc8cc3061a348ee83ebba6ae4cd9377'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.537 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.556 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/memory.usage volume: 46.24609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4873b932-f43e-4768-b80b-030f8e451c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.24609375, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'timestamp': '2026-01-30T09:31:55.537472', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8468ecf4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.951267426, 'message_signature': '68d74fee6b4ceb999d747244a07cc79c7261cc53c2476b88367266a89f8fc2d8'}]}, 'timestamp': '2026-01-30 09:31:55.556972', '_unique_id': 'afb3fc56579643f7b48503d612f03d27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.558 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '702668fe-ab36-4703-930e-8f00c810cc96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.558860', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '8469446a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': 'f4edfab9b605d614a1aa21c871f265fd65f6605bfca016d5247eaacb5e986524'}]}, 'timestamp': '2026-01-30 09:31:55.559203', '_unique_id': '9c95f14779b342f2ae8f18eeb9a01b5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.560 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.561 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d1402b6-8225-4fa6-ae0d-a7a622cb430b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.560709', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '84698d62-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.896763398, 'message_signature': '7c198796f42f490fb2909ea1368d6bfb8589ccc7acb916d00f85bab40d598c08'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.560709', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8469996a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.896763398, 'message_signature': 'f1d59a9e84b7cafb187f22c4bd133ab510b170c8347ad5c8cb9fb9e74615207b'}]}, 'timestamp': '2026-01-30 09:31:55.561375', '_unique_id': 'e5aa7545d79546c4a13aa77bd5175dc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.563 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/cpu volume: 11830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9344027e-5f93-41b8-a1c4-8ab6faa563dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11830000000, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'timestamp': '2026-01-30T09:31:55.563044', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8469e802-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.951267426, 'message_signature': '82e086e7116861ecd1d56b68bd1af50f8a30dfd7c8b26b6d3194cde7578f4155'}]}, 'timestamp': '2026-01-30 09:31:55.563359', '_unique_id': '2e5dfc0dd2ef46e9bfdbdc5ce1e9a1d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.564 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.requests volume: 1089 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.565 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '904c9d04-8c78-4683-bc15-db2d2b5857ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1089, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.564911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '846a3064-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': 'e5b508b846f91936e9f98ed3624bc734c68f8d57843b8bb1b422fe6092c2e9a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.564911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '846a3c1c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '70d6c52fdbaf8ee4d8dba0d055b5deab65718920e69689bf9f8d2702ee79a1fc'}]}, 'timestamp': '2026-01-30 09:31:55.565498', '_unique_id': '669548f019734cc4819a89fdd840c8b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.567 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0df7bc39-0e71-49a3-90ab-c9a6350a754f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.567065', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '846a856e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '6a3fdde91a015b3bc19473386931e7ab90e14bd3145469b86cca8913da3418b8'}]}, 'timestamp': '2026-01-30 09:31:55.567396', '_unique_id': '68423d3a5e7940a0949a915e56246dd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.568 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.569 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.569 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.bytes volume: 30276096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.569 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '431a7f60-59f3-4ef0-a3c4-632604519446', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30276096, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.569164', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '846ad7ee-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': 'd1ff1563d76041df3e761391409005b8bbd3a8efc27304ef0e57c789a1d7b762'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.569164', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '846ae266-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '4e7d62075d4ba10dbdf27c15fbf630b2ae93f4461058d2f3e716ccb2e14debbe'}]}, 'timestamp': '2026-01-30 09:31:55.569754', '_unique_id': '883db6d1c43e46e4affd300f9c97a48c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.571 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.571 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.latency volume: 2107401182 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.571 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc364cae-9f9b-4172-9169-aa018d324970', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2107401182, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-vda', 'timestamp': '2026-01-30T09:31:55.571247', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '846b27c6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '0b2019d0342e294e0ae9e0624655b707f58abfb65a7dd5712a5ddc08e602b0a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73-sda', 'timestamp': '2026-01-30T09:31:55.571247', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'instance-00000014', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '846b3234-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.863663598, 'message_signature': '68bc621f6f2eb8e976129aa051470a9f0991af1f2b5711b6c3abaef36a0429f3'}]}, 'timestamp': '2026-01-30 09:31:55.571796', '_unique_id': '6fe7382e9d7e4d5d832112b5be267e1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.573 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.573 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.incoming.bytes volume: 7284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a9fe477-edac-4a44-b534-fdb0e4b47acf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7284, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.573263', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '846b7690-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '99cf1bc89476eb96ec6d7e00f498a31f2fa002bf274ff7436f2c934aea70f3f6'}]}, 'timestamp': '2026-01-30 09:31:55.573567', '_unique_id': 'fe84efb4f5634a229105c85631134340'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.574 12 DEBUG ceilometer.compute.pollsters [-] 93629e5c-ca92-47ac-8567-35d85b4e2a73/network.outgoing.bytes volume: 5882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83168a81-4668-47c9-9952-337c25f84d1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5882, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000014-93629e5c-ca92-47ac-8567-35d85b4e2a73-tap695209cb-0d', 'timestamp': '2026-01-30T09:31:55.574944', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569', 'name': 'tap695209cb-0d', 'instance_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:ea:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap695209cb-0d'}, 'message_id': '846bb808-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4046.848446081, 'message_signature': '5ad149df584c584d775ef9171c82dae720c71b19ed522deaa715e486e424e7ed'}]}, 'timestamp': '2026-01-30 09:31:55.575292', '_unique_id': 'df2b53eca3c946fdb8fe6afb5b7542af'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:31:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:31:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:31:55 np0005601977 nova_compute[183130]: 2026-01-30 09:31:55.908 183134 DEBUG nova.network.neutron [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.067 183134 DEBUG nova.network.neutron [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updating instance_info_cache with network_info: [{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.097 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.098 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance network_info: |[{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.098 183134 DEBUG oslo_concurrency.lockutils [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.098 183134 DEBUG nova.network.neutron [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Refreshing network info cache for port 1c1b6dde-b8fc-4af2-9a67-11240761a805 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.102 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Start _get_guest_xml network_info=[{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.106 183134 WARNING nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.111 183134 DEBUG nova.virt.libvirt.host [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.112 183134 DEBUG nova.virt.libvirt.host [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.117 183134 DEBUG nova.virt.libvirt.host [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.118 183134 DEBUG nova.virt.libvirt.host [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.119 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.119 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.120 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.120 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.121 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.121 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.121 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.121 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.122 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.122 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.122 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.123 183134 DEBUG nova.virt.hardware [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.126 183134 DEBUG nova.virt.libvirt.vif [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1575574141',display_name='tempest-TestNetworkAdvancedServerOps-server-1575574141',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1575574141',id=24,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7+DEG0j4DwrhGi6Li8X1HJY2IlLENpe8wHNWNUSYf6uplctgauCp7ClsFJ4rfPZ7qGthxQvxsuwAx1dpIqacU6XqzNEDTVhJLa2bDmNGPQnx6wh817TnRHA/3QIu2i5w==',key_name='tempest-TestNetworkAdvancedServerOps-834319867',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-ceq2vuir',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:31:47Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=9a05f548-167d-4fc7-b5ec-87e02ee03818,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.127 183134 DEBUG nova.network.os_vif_util [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.128 183134 DEBUG nova.network.os_vif_util [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.129 183134 DEBUG nova.objects.instance [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 9a05f548-167d-4fc7-b5ec-87e02ee03818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.157 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <uuid>9a05f548-167d-4fc7-b5ec-87e02ee03818</uuid>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <name>instance-00000018</name>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1575574141</nova:name>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:31:56</nova:creationTime>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        <nova:port uuid="1c1b6dde-b8fc-4af2-9a67-11240761a805">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <entry name="serial">9a05f548-167d-4fc7-b5ec-87e02ee03818</entry>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <entry name="uuid">9a05f548-167d-4fc7-b5ec-87e02ee03818</entry>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.config"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:55:7f:32"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <target dev="tap1c1b6dde-b8"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/console.log" append="off"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:31:56 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:31:56 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:31:56 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:31:56 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.158 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Preparing to wait for external event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.159 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.159 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.160 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.161 183134 DEBUG nova.virt.libvirt.vif [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1575574141',display_name='tempest-TestNetworkAdvancedServerOps-server-1575574141',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1575574141',id=24,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7+DEG0j4DwrhGi6Li8X1HJY2IlLENpe8wHNWNUSYf6uplctgauCp7ClsFJ4rfPZ7qGthxQvxsuwAx1dpIqacU6XqzNEDTVhJLa2bDmNGPQnx6wh817TnRHA/3QIu2i5w==',key_name='tempest-TestNetworkAdvancedServerOps-834319867',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-ceq2vuir',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:31:47Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=9a05f548-167d-4fc7-b5ec-87e02ee03818,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.161 183134 DEBUG nova.network.os_vif_util [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.162 183134 DEBUG nova.network.os_vif_util [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.163 183134 DEBUG os_vif [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.163 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.164 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.164 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.167 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.168 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c1b6dde-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.168 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c1b6dde-b8, col_values=(('external_ids', {'iface-id': '1c1b6dde-b8fc-4af2-9a67-11240761a805', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:7f:32', 'vm-uuid': '9a05f548-167d-4fc7-b5ec-87e02ee03818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.171 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 NetworkManager[55565]: <info>  [1769765516.1724] manager: (tap1c1b6dde-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.174 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.176 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.177 183134 INFO os_vif [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8')#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.231 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.232 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.232 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:55:7f:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.232 183134 INFO nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Using config drive#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.305 183134 DEBUG nova.compute.manager [req-48b5877b-f080-4a82-9c32-165b287e5749 req-defe1a5b-cbfc-4a8e-9a02-5057281645a4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Received event network-vif-deleted-fb902761-f001-4e8a-9c56-1bdc4fb6a88e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.482 183134 DEBUG nova.compute.manager [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-changed-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.482 183134 DEBUG nova.compute.manager [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Refreshing instance network info cache due to event network-changed-48d259d8-4396-4640-8f63-b475acc34639. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.483 183134 DEBUG oslo_concurrency.lockutils [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.778 183134 INFO nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Creating config drive at /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.config#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.782 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpst_54fqm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.910 183134 DEBUG oslo_concurrency.processutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpst_54fqm" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:56 np0005601977 kernel: tap1c1b6dde-b8: entered promiscuous mode
Jan 30 04:31:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:56Z|00220|binding|INFO|Claiming lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 for this chassis.
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.960 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 NetworkManager[55565]: <info>  [1769765516.9618] manager: (tap1c1b6dde-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 30 04:31:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:56Z|00221|binding|INFO|1c1b6dde-b8fc-4af2-9a67-11240761a805: Claiming fa:16:3e:55:7f:32 10.100.0.4
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.964 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:56Z|00222|binding|INFO|Setting lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 ovn-installed in OVS
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.970 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:56Z|00223|binding|INFO|Setting lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 up in Southbound
Jan 30 04:31:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:56.974 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:7f:32 10.100.0.4'], port_security=['fa:16:3e:55:7f:32 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9a05f548-167d-4fc7-b5ec-87e02ee03818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d45d44c8-d301-433f-9039-6429d186e2f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '98cd4d85-cb40-4cb4-a4ca-491f05860190', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08caa929-3ce8-4aa7-a7b2-d4123f0d5025, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=1c1b6dde-b8fc-4af2-9a67-11240761a805) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:56 np0005601977 nova_compute[183130]: 2026-01-30 09:31:56.975 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:56.976 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 1c1b6dde-b8fc-4af2-9a67-11240761a805 in datapath d45d44c8-d301-433f-9039-6429d186e2f1 bound to our chassis#033[00m
Jan 30 04:31:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:56.979 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d45d44c8-d301-433f-9039-6429d186e2f1#033[00m
Jan 30 04:31:56 np0005601977 systemd-udevd[218017]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:31:56 np0005601977 systemd-machined[154431]: New machine qemu-17-instance-00000018.
Jan 30 04:31:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:56.997 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbabd90-e2af-45d2-80a5-53670e15e2eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:56.998 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd45d44c8-d1 in ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.001 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd45d44c8-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.001 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[312df798-bd62-4afb-8495-0d5b128d425b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.002 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fabd68a0-5194-4e16-b1e2-88e8300fe789]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 NetworkManager[55565]: <info>  [1769765517.0063] device (tap1c1b6dde-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:31:57 np0005601977 NetworkManager[55565]: <info>  [1769765517.0074] device (tap1c1b6dde-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:31:57 np0005601977 systemd[1]: Started Virtual Machine qemu-17-instance-00000018.
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.009 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[63a48226-6aff-4334-b5ca-e168eff4aae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.023 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a3bbbb2f-4a03-4349-9019-81364242ca22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.041 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c3fd2480-9e03-4263-8d03-07f9e37de328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.047 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3462c20f-ce29-4363-900a-253ba897e20f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 NetworkManager[55565]: <info>  [1769765517.0486] manager: (tapd45d44c8-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Jan 30 04:31:57 np0005601977 systemd-udevd[218020]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.069 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[04e94c60-b62a-45b7-84b6-cb9919044c05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.072 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d50d482a-e770-4f47-af42-27e546dacd10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 NetworkManager[55565]: <info>  [1769765517.0903] device (tapd45d44c8-d0): carrier: link connected
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.093 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e4574b28-ab85-4509-a2e8-abf569e67000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.111 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[28f5ac70-58df-49d6-b43c-7af5b7a8b287]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd45d44c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:29:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404843, 'reachable_time': 24878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218049, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.121 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6c54fd-fe0d-4ecf-9e1d-7bcf4491bb71]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:29b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404843, 'tstamp': 404843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218050, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.134 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc168e7-4cf5-4b5d-8bc9-377a9f7609f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd45d44c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:29:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404843, 'reachable_time': 24878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218051, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.154 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ce4597-fe47-4805-b463-f38fab357ced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.196 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1a339e57-5609-45b2-8307-155a4a53df3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.197 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd45d44c8-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.198 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.198 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd45d44c8-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:57 np0005601977 kernel: tapd45d44c8-d0: entered promiscuous mode
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.199 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:57 np0005601977 NetworkManager[55565]: <info>  [1769765517.2004] manager: (tapd45d44c8-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.203 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd45d44c8-d0, col_values=(('external_ids', {'iface-id': '1479a1c4-748b-426a-bb01-ac1ca7771477'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:57Z|00224|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.204 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.205 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d45d44c8-d301-433f-9039-6429d186e2f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d45d44c8-d301-433f-9039-6429d186e2f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.206 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a15c4f72-02d7-4a86-971e-c621fa4f9b0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.207 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-d45d44c8-d301-433f-9039-6429d186e2f1
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/d45d44c8-d301-433f-9039-6429d186e2f1.pid.haproxy
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID d45d44c8-d301-433f-9039-6429d186e2f1
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.207 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'env', 'PROCESS_TAG=haproxy-d45d44c8-d301-433f-9039-6429d186e2f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d45d44c8-d301-433f-9039-6429d186e2f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.211 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.384 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.385 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:57.391 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:57 np0005601977 podman[218083]: 2026-01-30 09:31:57.555254469 +0000 UTC m=+0.055748586 container create d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:31:57 np0005601977 systemd[1]: Started libpod-conmon-d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738.scope.
Jan 30 04:31:57 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:31:57 np0005601977 podman[218083]: 2026-01-30 09:31:57.528847655 +0000 UTC m=+0.029341782 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:31:57 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54e7fdfc6fc440d460f3660486e64713f4d24bda361f710eb887d348bfb6f746/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:31:57 np0005601977 podman[218083]: 2026-01-30 09:31:57.641501268 +0000 UTC m=+0.141995465 container init d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:31:57 np0005601977 podman[218083]: 2026-01-30 09:31:57.647816743 +0000 UTC m=+0.148310880 container start d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.671 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765517.6705108, 9a05f548-167d-4fc7-b5ec-87e02ee03818 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.672 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] VM Started (Lifecycle Event)#033[00m
Jan 30 04:31:57 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [NOTICE]   (218109) : New worker (218111) forked
Jan 30 04:31:57 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [NOTICE]   (218109) : Loading success.
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.694 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.700 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765517.6720312, 9a05f548-167d-4fc7-b5ec-87e02ee03818 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.701 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.931 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.936 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:31:57 np0005601977 nova_compute[183130]: 2026-01-30 09:31:57.957 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.800 183134 DEBUG nova.network.neutron [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updating instance_info_cache with network_info: [{"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.826 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.826 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Instance network_info: |[{"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.827 183134 DEBUG oslo_concurrency.lockutils [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.827 183134 DEBUG nova.network.neutron [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Refreshing network info cache for port 48d259d8-4396-4640-8f63-b475acc34639 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.830 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Start _get_guest_xml network_info=[{"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.834 183134 WARNING nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.840 183134 DEBUG nova.virt.libvirt.host [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.841 183134 DEBUG nova.virt.libvirt.host [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.845 183134 DEBUG nova.virt.libvirt.host [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.845 183134 DEBUG nova.virt.libvirt.host [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.847 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.847 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.848 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.848 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.849 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.849 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.849 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.849 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.850 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.850 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.851 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.851 183134 DEBUG nova.virt.hardware [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.855 183134 DEBUG nova.virt.libvirt.vif [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:29:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=22,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyOupxEQP5rPhxv3Ovs0buVpKo9DK1SFWIgHh1g4GNOSB04wmj6A6QDKnx5FDTCMUBmlFKzh8u77bIg75/X+JZ/jpIK2VxEM7v20lB4s0EWjtZAb/cScGOoEldqGiJNmQ==',key_name='tempest-TestSecurityGroupsBasicOps-187707995',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-tuyft23r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:31:47Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=6ad35592-8899-48da-ac75-5702a09afa33,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.856 183134 DEBUG nova.network.os_vif_util [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.856 183134 DEBUG nova.network.os_vif_util [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.858 183134 DEBUG nova.objects.instance [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6ad35592-8899-48da-ac75-5702a09afa33 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:31:58 np0005601977 podman[218120]: 2026-01-30 09:31:58.864918557 +0000 UTC m=+0.084096997 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.874 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <uuid>6ad35592-8899-48da-ac75-5702a09afa33</uuid>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <name>instance-00000016</name>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378</nova:name>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:31:58</nova:creationTime>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        <nova:port uuid="48d259d8-4396-4640-8f63-b475acc34639">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <entry name="serial">6ad35592-8899-48da-ac75-5702a09afa33</entry>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <entry name="uuid">6ad35592-8899-48da-ac75-5702a09afa33</entry>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.config"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:fa:bd:ad"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <target dev="tap48d259d8-43"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/console.log" append="off"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:31:58 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:31:58 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:31:58 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:31:58 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.875 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Preparing to wait for external event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.875 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.876 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.876 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.877 183134 DEBUG nova.virt.libvirt.vif [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:29:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=22,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyOupxEQP5rPhxv3Ovs0buVpKo9DK1SFWIgHh1g4GNOSB04wmj6A6QDKnx5FDTCMUBmlFKzh8u77bIg75/X+JZ/jpIK2VxEM7v20lB4s0EWjtZAb/cScGOoEldqGiJNmQ==',key_name='tempest-TestSecurityGroupsBasicOps-187707995',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-tuyft23r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:31:47Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=6ad35592-8899-48da-ac75-5702a09afa33,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.877 183134 DEBUG nova.network.os_vif_util [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.878 183134 DEBUG nova.network.os_vif_util [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.878 183134 DEBUG os_vif [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.879 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.879 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.879 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.884 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.884 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48d259d8-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.885 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap48d259d8-43, col_values=(('external_ids', {'iface-id': '48d259d8-4396-4640-8f63-b475acc34639', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:bd:ad', 'vm-uuid': '6ad35592-8899-48da-ac75-5702a09afa33'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:58 np0005601977 NetworkManager[55565]: <info>  [1769765518.8878] manager: (tap48d259d8-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.887 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.891 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.893 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.894 183134 INFO os_vif [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43')#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.961 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.962 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.963 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:fa:bd:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:31:58 np0005601977 nova_compute[183130]: 2026-01-30 09:31:58.964 183134 INFO nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Using config drive#033[00m
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.407 183134 INFO nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Creating config drive at /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.config#033[00m
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.411 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfu1tqvt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.533 183134 DEBUG oslo_concurrency.processutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfu1tqvt" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:31:59 np0005601977 kernel: tap48d259d8-43: entered promiscuous mode
Jan 30 04:31:59 np0005601977 NetworkManager[55565]: <info>  [1769765519.6134] manager: (tap48d259d8-43): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.613 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:59 np0005601977 systemd-udevd[218044]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:59 np0005601977 NetworkManager[55565]: <info>  [1769765519.6314] device (tap48d259d8-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:31:59 np0005601977 NetworkManager[55565]: <info>  [1769765519.6319] device (tap48d259d8-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:31:59 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:59Z|00225|binding|INFO|Claiming lport 48d259d8-4396-4640-8f63-b475acc34639 for this chassis.
Jan 30 04:31:59 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:59Z|00226|binding|INFO|48d259d8-4396-4640-8f63-b475acc34639: Claiming fa:16:3e:fa:bd:ad 10.100.0.11
Jan 30 04:31:59 np0005601977 systemd-machined[154431]: New machine qemu-18-instance-00000016.
Jan 30 04:31:59 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:59Z|00227|binding|INFO|Setting lport 48d259d8-4396-4640-8f63-b475acc34639 ovn-installed in OVS
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.657 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:bd:ad 10.100.0.11'], port_security=['fa:16:3e:fa:bd:ad 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '6ad35592-8899-48da-ac75-5702a09afa33', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a866993-35dd-4fa6-b18e-da0d2901678a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3cddb3-a489-4457-a955-237f0d7cc907, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=48d259d8-4396-4640-8f63-b475acc34639) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:59 np0005601977 ovn_controller[95460]: 2026-01-30T09:31:59Z|00228|binding|INFO|Setting lport 48d259d8-4396-4640-8f63-b475acc34639 up in Southbound
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.658 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 48d259d8-4396-4640-8f63-b475acc34639 in datapath baf5a6be-5cb0-4dff-8451-d79eaebce0be bound to our chassis#033[00m
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.658 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.660 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network baf5a6be-5cb0-4dff-8451-d79eaebce0be
Jan 30 04:31:59 np0005601977 systemd[1]: Started Virtual Machine qemu-18-instance-00000016.
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.672 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fcabc5b9-b176-47f2-bbbc-a96ad33ac05d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.694 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[006d6486-f01f-4aa4-8e8e-5b1b84f409b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.697 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[737927da-37a8-44c7-b1e4-3f6a5fdab586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.713 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[764de25b-2453-403e-a4b8-9e761a276877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.725 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba5d807-cfd8-467a-80ef-2836bfe81e3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaf5a6be-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389692, 'reachable_time': 36864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218179, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.738 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[51854a86-1f78-4b59-9400-d8e60aea59b7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbaf5a6be-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389700, 'tstamp': 389700}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218181, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbaf5a6be-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389702, 'tstamp': 389702}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218181, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.741 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaf5a6be-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.743 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.745 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaf5a6be-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.746 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.746 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbaf5a6be-50, col_values=(('external_ids', {'iface-id': '663ef153-23ef-4ecf-ab76-b6916e4933b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:31:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:31:59.747 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.843 183134 DEBUG nova.network.neutron [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updated VIF entry in instance network info cache for port 1c1b6dde-b8fc-4af2-9a67-11240761a805. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.844 183134 DEBUG nova.network.neutron [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updating instance_info_cache with network_info: [{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:31:59 np0005601977 nova_compute[183130]: 2026-01-30 09:31:59.874 183134 DEBUG oslo_concurrency.lockutils [req-f792cee9-fd0c-4a47-8137-c835e9e2fbb2 req-75f1f2f2-4fe6-4275-a305-4ff42406004d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.141 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765520.1408138, 6ad35592-8899-48da-ac75-5702a09afa33 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.142 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] VM Started (Lifecycle Event)
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.246 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.251 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765520.1421542, 6ad35592-8899-48da-ac75-5702a09afa33 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.251 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] VM Paused (Lifecycle Event)
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.272 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.275 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.299 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.368 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.368 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.455 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.526 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.532 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.533 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.589 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.595 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.649 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.650 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.723 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.732 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.793 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.794 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.838 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.964 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.965 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5483MB free_disk=73.32141876220703GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.965 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:00 np0005601977 nova_compute[183130]: 2026-01-30 09:32:00.965 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.088 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 93629e5c-ca92-47ac-8567-35d85b4e2a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.088 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 9a05f548-167d-4fc7-b5ec-87e02ee03818 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.088 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 6ad35592-8899-48da-ac75-5702a09afa33 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.088 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.089 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.263 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.326 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.422 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.423 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.424 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.424 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 30 04:32:01 np0005601977 nova_compute[183130]: 2026-01-30 09:32:01.473 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.075 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.075 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.094 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.161 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.162 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.169 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.169 183134 INFO nova.compute.claims [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Claim successful on node compute-0.ctlplane.example.com
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.330 183134 DEBUG nova.compute.provider_tree [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.345 183134 DEBUG nova.scheduler.client.report [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.366 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.367 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.407 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.408 183134 DEBUG nova.network.neutron [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.423 183134 INFO nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.446 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.541 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.543 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.544 183134 INFO nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Creating image(s)
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.546 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.546 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.547 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.565 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.623 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.625 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.625 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.643 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.699 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.700 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.724 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.725 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.726 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.740 183134 DEBUG nova.network.neutron [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updated VIF entry in instance network info cache for port 48d259d8-4396-4640-8f63-b475acc34639. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.741 183134 DEBUG nova.network.neutron [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updating instance_info_cache with network_info: [{"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.743 183134 DEBUG nova.policy [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.772 183134 DEBUG oslo_concurrency.lockutils [req-e7647b45-128f-42ca-8860-f083f81702b0 req-06733a65-ce1c-4926-aa77-a764fb614f8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.778 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.778 183134 DEBUG nova.virt.disk.api [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.778 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.824 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.825 183134 DEBUG nova.virt.disk.api [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.825 183134 DEBUG nova.objects.instance [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.844 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.845 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Ensure instance console log exists: /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.846 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.846 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:02 np0005601977 nova_compute[183130]: 2026-01-30 09:32:02.846 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:32:03 np0005601977 nova_compute[183130]: 2026-01-30 09:32:03.889 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.249 183134 DEBUG nova.compute.manager [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.250 183134 DEBUG oslo_concurrency.lockutils [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.250 183134 DEBUG oslo_concurrency.lockutils [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.250 183134 DEBUG oslo_concurrency.lockutils [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.251 183134 DEBUG nova.compute.manager [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Processing event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.251 183134 DEBUG nova.compute.manager [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.251 183134 DEBUG oslo_concurrency.lockutils [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.251 183134 DEBUG oslo_concurrency.lockutils [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.251 183134 DEBUG oslo_concurrency.lockutils [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.252 183134 DEBUG nova.compute.manager [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.252 183134 WARNING nova.compute.manager [req-218d89c9-7fef-4946-88d7-6b7b3420dc43 req-31d723c4-23d3-4b43-a435-3894d877de7b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received unexpected event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with vm_state building and task_state spawning.
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.252 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.257 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765525.2573335, 9a05f548-167d-4fc7-b5ec-87e02ee03818 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.257 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] VM Resumed (Lifecycle Event)
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.260 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.264 183134 INFO nova.virt.libvirt.driver [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance spawned successfully.
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.264 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.300 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.305 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.312 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.312 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.313 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.313 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.313 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.314 183134 DEBUG nova.virt.libvirt.driver [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.348 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.413 183134 INFO nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Took 18.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.414 183134 DEBUG nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.490 183134 INFO nova.compute.manager [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Took 18.76 seconds to build instance.#033[00m
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.509 183134 DEBUG oslo_concurrency.lockutils [None req-d3c7a041-c2c7-435a-8f5a-f70536850cac 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.528 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.551 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:05 np0005601977 nova_compute[183130]: 2026-01-30 09:32:05.909 183134 DEBUG nova.network.neutron [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Successfully created port: f469de0f-e330-4b6b-853b-397301173e4e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:32:06 np0005601977 podman[218225]: 2026-01-30 09:32:06.824824407 +0000 UTC m=+0.046191855 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.064 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765512.0629675, 7a073e24-c800-4962-af5e-ff5400800f34 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.064 183134 INFO nova.compute.manager [-] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.089 183134 DEBUG nova.compute.manager [None req-aeaa606e-90fa-4847-894e-253356f5f344 - - - - - -] [instance: 7a073e24-c800-4962-af5e-ff5400800f34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:07Z|00229|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:32:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:07Z|00230|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.292 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:07Z|00231|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:32:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:07Z|00232|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.584 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.975 183134 DEBUG nova.network.neutron [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Successfully updated port: f469de0f-e330-4b6b-853b-397301173e4e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.993 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.993 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:07 np0005601977 nova_compute[183130]: 2026-01-30 09:32:07.994 183134 DEBUG nova.network.neutron [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.066 183134 DEBUG nova.compute.manager [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.066 183134 DEBUG oslo_concurrency.lockutils [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.067 183134 DEBUG oslo_concurrency.lockutils [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.067 183134 DEBUG oslo_concurrency.lockutils [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.067 183134 DEBUG nova.compute.manager [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Processing event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.068 183134 DEBUG nova.compute.manager [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.068 183134 DEBUG oslo_concurrency.lockutils [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.068 183134 DEBUG oslo_concurrency.lockutils [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.069 183134 DEBUG oslo_concurrency.lockutils [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.069 183134 DEBUG nova.compute.manager [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] No waiting events found dispatching network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.069 183134 WARNING nova.compute.manager [req-b395d43b-7f1b-4b44-8395-42ed36f5d336 req-302380f0-ce7e-4fac-b11f-6322868504b1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received unexpected event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.070 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.073 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.074 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765528.0741463, 6ad35592-8899-48da-ac75-5702a09afa33 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.075 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.081 183134 INFO nova.virt.libvirt.driver [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Instance spawned successfully.#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.081 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.096 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.099 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.108 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.108 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.109 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.109 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.109 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.110 183134 DEBUG nova.virt.libvirt.driver [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.144 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.197 183134 INFO nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Took 20.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.198 183134 DEBUG nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.270 183134 INFO nova.compute.manager [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Took 21.47 seconds to build instance.#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.286 183134 DEBUG oslo_concurrency.lockutils [None req-f06ef591-d72a-44de-9ac5-e1143cb905ae 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.366 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.653 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.653 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.654 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.654 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 93629e5c-ca92-47ac-8567-35d85b4e2a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.657 183134 DEBUG nova.network.neutron [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:32:08 np0005601977 nova_compute[183130]: 2026-01-30 09:32:08.892 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.185 183134 DEBUG nova.compute.manager [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-changed-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.185 183134 DEBUG nova.compute.manager [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing instance network info cache due to event network-changed-f469de0f-e330-4b6b-853b-397301173e4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.185 183134 DEBUG oslo_concurrency.lockutils [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.531 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.585 183134 DEBUG nova.network.neutron [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.621 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.622 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Instance network_info: |[{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.623 183134 DEBUG oslo_concurrency.lockutils [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.623 183134 DEBUG nova.network.neutron [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing network info cache for port f469de0f-e330-4b6b-853b-397301173e4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.628 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Start _get_guest_xml network_info=[{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.633 183134 WARNING nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.639 183134 DEBUG nova.virt.libvirt.host [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.640 183134 DEBUG nova.virt.libvirt.host [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.645 183134 DEBUG nova.virt.libvirt.host [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.646 183134 DEBUG nova.virt.libvirt.host [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.647 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.648 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.648 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.649 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.649 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.650 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.650 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.651 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.651 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.652 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.652 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.653 183134 DEBUG nova.virt.hardware [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.659 183134 DEBUG nova.virt.libvirt.vif [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:02Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.659 183134 DEBUG nova.network.os_vif_util [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.661 183134 DEBUG nova.network.os_vif_util [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.663 183134 DEBUG nova.objects.instance [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.678 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.696 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <uuid>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</uuid>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <name>instance-0000001b</name>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:32:10</nova:creationTime>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <entry name="serial">18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <entry name="uuid">18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:ac:3e:b3"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <target dev="tapf469de0f-e3"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log" append="off"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:32:10 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:32:10 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:32:10 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:32:10 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.698 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Preparing to wait for external event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.698 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.699 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.699 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.700 183134 DEBUG nova.virt.libvirt.vif [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:02Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.701 183134 DEBUG nova.network.os_vif_util [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.702 183134 DEBUG nova.network.os_vif_util [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.703 183134 DEBUG os_vif [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.704 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.705 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.706 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.708 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.709 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.710 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.711 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.711 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.712 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf469de0f-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.713 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf469de0f-e3, col_values=(('external_ids', {'iface-id': 'f469de0f-e330-4b6b-853b-397301173e4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:3e:b3', 'vm-uuid': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:10 np0005601977 NetworkManager[55565]: <info>  [1769765530.7170] manager: (tapf469de0f-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.718 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.719 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.720 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.720 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.723 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.724 183134 INFO os_vif [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3')#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.746 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.798 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.799 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.799 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:ac:3e:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:32:10 np0005601977 nova_compute[183130]: 2026-01-30 09:32:10.800 183134 INFO nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Using config drive#033[00m
Jan 30 04:32:11 np0005601977 nova_compute[183130]: 2026-01-30 09:32:11.706 183134 INFO nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Creating config drive at /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config#033[00m
Jan 30 04:32:11 np0005601977 nova_compute[183130]: 2026-01-30 09:32:11.710 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprq7r8l0k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:11 np0005601977 nova_compute[183130]: 2026-01-30 09:32:11.825 183134 DEBUG oslo_concurrency.processutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprq7r8l0k" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:11 np0005601977 kernel: tapf469de0f-e3: entered promiscuous mode
Jan 30 04:32:11 np0005601977 NetworkManager[55565]: <info>  [1769765531.9034] manager: (tapf469de0f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 30 04:32:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:11Z|00233|binding|INFO|Claiming lport f469de0f-e330-4b6b-853b-397301173e4e for this chassis.
Jan 30 04:32:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:11Z|00234|binding|INFO|f469de0f-e330-4b6b-853b-397301173e4e: Claiming fa:16:3e:ac:3e:b3 10.100.0.10
Jan 30 04:32:11 np0005601977 nova_compute[183130]: 2026-01-30 09:32:11.902 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:11Z|00235|binding|INFO|Setting lport f469de0f-e330-4b6b-853b-397301173e4e ovn-installed in OVS
Jan 30 04:32:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:11Z|00236|binding|INFO|Setting lport f469de0f-e330-4b6b-853b-397301173e4e up in Southbound
Jan 30 04:32:11 np0005601977 nova_compute[183130]: 2026-01-30 09:32:11.912 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.909 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:3e:b3 10.100.0.10'], port_security=['fa:16:3e:ac:3e:b3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': '465bb202-1df3-4c6e-82e9-19a120fe9790', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11cd1bd5-e27d-4fc7-95b6-d09dd95ff43a, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=f469de0f-e330-4b6b-853b-397301173e4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.911 104706 INFO neutron.agent.ovn.metadata.agent [-] Port f469de0f-e330-4b6b-853b-397301173e4e in datapath 408e9205-54bc-4c8e-9fe0-c3c49be6610d bound to our chassis#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.913 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 408e9205-54bc-4c8e-9fe0-c3c49be6610d#033[00m
Jan 30 04:32:11 np0005601977 nova_compute[183130]: 2026-01-30 09:32:11.914 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.924 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad6cd93-1d3f-4c21-8ed5-015cb44d0b23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.925 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap408e9205-51 in ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.928 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap408e9205-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.928 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[81f32b4e-19e0-4bcb-b0ab-183006ae9393]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.931 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[89de8d62-903b-4a56-9199-31511ca84e83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 systemd-machined[154431]: New machine qemu-19-instance-0000001b.
Jan 30 04:32:11 np0005601977 systemd-udevd[218273]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.940 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[552382f6-ac0f-4ad6-bd7d-ebbfa7070bfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 NetworkManager[55565]: <info>  [1769765531.9505] device (tapf469de0f-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:32:11 np0005601977 NetworkManager[55565]: <info>  [1769765531.9516] device (tapf469de0f-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:32:11 np0005601977 systemd[1]: Started Virtual Machine qemu-19-instance-0000001b.
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.954 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ff1fae5e-e2f7-4368-a51e-044d5930caec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.980 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ba63b0c5-16ba-42d8-baa9-9dce2a54e3d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:11.985 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[33993f4a-456f-4c13-a74d-a2478e6232e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:11 np0005601977 NetworkManager[55565]: <info>  [1769765531.9907] manager: (tap408e9205-50): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.009 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[173aafa5-577d-46a6-87a8-683ea98326c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.015 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[54a7ce25-189d-499e-b09b-b782383549e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 NetworkManager[55565]: <info>  [1769765532.0333] device (tap408e9205-50): carrier: link connected
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.036 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9595a7-ba6d-4706-91be-b4a90e6ec7ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.049 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[462b13fa-5164-4480-bf36-b64dd3678595]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap408e9205-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:b8:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406337, 'reachable_time': 20187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218304, 'error': None, 'target': 'ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.064 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0fa8d7-8cd4-4303-8c5a-c72bf4ca7200]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:b818'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406337, 'tstamp': 406337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218305, 'error': None, 'target': 'ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.077 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[edc04dc7-f58e-4899-bf14-f5f2716be1bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap408e9205-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:b8:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406337, 'reachable_time': 20187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218306, 'error': None, 'target': 'ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.101 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ded1362f-e864-472e-8248-0e15e08501e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.140 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[71748ca1-cd64-4d49-b8cf-2c7db05dbb2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.141 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap408e9205-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.142 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.142 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap408e9205-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:12 np0005601977 kernel: tap408e9205-50: entered promiscuous mode
Jan 30 04:32:12 np0005601977 NetworkManager[55565]: <info>  [1769765532.1446] manager: (tap408e9205-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 30 04:32:12 np0005601977 nova_compute[183130]: 2026-01-30 09:32:12.144 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.147 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap408e9205-50, col_values=(('external_ids', {'iface-id': 'afb82ca4-9bbd-4c23-b82a-439171c628d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:12 np0005601977 nova_compute[183130]: 2026-01-30 09:32:12.147 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:12 np0005601977 nova_compute[183130]: 2026-01-30 09:32:12.150 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.151 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/408e9205-54bc-4c8e-9fe0-c3c49be6610d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/408e9205-54bc-4c8e-9fe0-c3c49be6610d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:32:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:12Z|00237|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.152 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9a29f82b-af79-488f-a2b4-08f11d1a0c3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.152 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-408e9205-54bc-4c8e-9fe0-c3c49be6610d
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/408e9205-54bc-4c8e-9fe0-c3c49be6610d.pid.haproxy
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 408e9205-54bc-4c8e-9fe0-c3c49be6610d
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:32:12 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:12.154 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'env', 'PROCESS_TAG=haproxy-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/408e9205-54bc-4c8e-9fe0-c3c49be6610d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:32:12 np0005601977 nova_compute[183130]: 2026-01-30 09:32:12.158 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:12 np0005601977 podman[218336]: 2026-01-30 09:32:12.4691276 +0000 UTC m=+0.048298107 container create 9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:32:12 np0005601977 systemd[1]: Started libpod-conmon-9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66.scope.
Jan 30 04:32:12 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:32:12 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4eb72d0412f2183f8b39edf5929bad799674d373be5105ec9ba0c5dae9a8b1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:32:12 np0005601977 podman[218336]: 2026-01-30 09:32:12.442140529 +0000 UTC m=+0.021311076 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:32:12 np0005601977 podman[218336]: 2026-01-30 09:32:12.541632345 +0000 UTC m=+0.120802862 container init 9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 30 04:32:12 np0005601977 podman[218336]: 2026-01-30 09:32:12.547450186 +0000 UTC m=+0.126620693 container start 9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 30 04:32:12 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [NOTICE]   (218356) : New worker (218358) forked
Jan 30 04:32:12 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [NOTICE]   (218356) : Loading success.
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.076 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765533.0759132, 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.077 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] VM Started (Lifecycle Event)#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.103 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.108 183134 DEBUG nova.compute.manager [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-changed-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.108 183134 DEBUG nova.compute.manager [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Refreshing instance network info cache due to event network-changed-1c1b6dde-b8fc-4af2-9a67-11240761a805. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.109 183134 DEBUG oslo_concurrency.lockutils [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.109 183134 DEBUG oslo_concurrency.lockutils [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.110 183134 DEBUG nova.network.neutron [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Refreshing network info cache for port 1c1b6dde-b8fc-4af2-9a67-11240761a805 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.118 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765533.0760968, 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.118 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.132 183134 DEBUG nova.network.neutron [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updated VIF entry in instance network info cache for port f469de0f-e330-4b6b-853b-397301173e4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.133 183134 DEBUG nova.network.neutron [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.167 183134 DEBUG oslo_concurrency.lockutils [req-90b8e033-8b16-4244-9e58-f5d8abd456cb req-07bdfbd5-789a-457a-a3cd-685fed0b8bb4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.168 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.171 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.197 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.701 183134 DEBUG nova.compute.manager [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-changed-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.701 183134 DEBUG nova.compute.manager [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Refreshing instance network info cache due to event network-changed-48d259d8-4396-4640-8f63-b475acc34639. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.702 183134 DEBUG oslo_concurrency.lockutils [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.702 183134 DEBUG oslo_concurrency.lockutils [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.703 183134 DEBUG nova.network.neutron [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Refreshing network info cache for port 48d259d8-4396-4640-8f63-b475acc34639 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:13 np0005601977 nova_compute[183130]: 2026-01-30 09:32:13.743 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.352 183134 DEBUG nova.compute.manager [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.353 183134 DEBUG oslo_concurrency.lockutils [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.353 183134 DEBUG oslo_concurrency.lockutils [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.353 183134 DEBUG oslo_concurrency.lockutils [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.354 183134 DEBUG nova.compute.manager [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Processing event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.354 183134 DEBUG nova.compute.manager [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.354 183134 DEBUG oslo_concurrency.lockutils [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.354 183134 DEBUG oslo_concurrency.lockutils [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.354 183134 DEBUG oslo_concurrency.lockutils [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.355 183134 DEBUG nova.compute.manager [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.355 183134 WARNING nova.compute.manager [req-05b115a4-883f-4d29-b302-e5290fd7ee4a req-f2d5a017-0cab-4540-85fe-9f0735d19148 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.355 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.358 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765535.3579352, 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.358 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.364 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.367 183134 INFO nova.virt.libvirt.driver [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Instance spawned successfully.#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.367 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.381 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.386 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.390 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.391 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.391 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.392 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.392 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.392 183134 DEBUG nova.virt.libvirt.driver [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.415 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.454 183134 INFO nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Took 12.91 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.455 183134 DEBUG nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.549 183134 INFO nova.compute.manager [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Took 13.41 seconds to build instance.#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.572 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.603 183134 DEBUG oslo_concurrency.lockutils [None req-05466d2f-1a64-4ac3-88ce-cb3b97f355d8 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:15 np0005601977 nova_compute[183130]: 2026-01-30 09:32:15.715 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.316 183134 DEBUG nova.network.neutron [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updated VIF entry in instance network info cache for port 1c1b6dde-b8fc-4af2-9a67-11240761a805. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.316 183134 DEBUG nova.network.neutron [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updating instance_info_cache with network_info: [{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.351 183134 DEBUG oslo_concurrency.lockutils [req-fe694821-6aab-4f5f-822c-e077fcc9de74 req-9c290860-6286-4842-b20c-3c58c9e04f29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:16Z|00238|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:32:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:16Z|00239|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:32:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:16Z|00240|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.403 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.496 183134 DEBUG nova.network.neutron [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updated VIF entry in instance network info cache for port 48d259d8-4396-4640-8f63-b475acc34639. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.496 183134 DEBUG nova.network.neutron [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updating instance_info_cache with network_info: [{"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:16 np0005601977 nova_compute[183130]: 2026-01-30 09:32:16.529 183134 DEBUG oslo_concurrency.lockutils [req-0a82faac-0631-4c73-8c66-af7d943e0026 req-8a2baf17-7408-478c-8146-99e20a6d26e1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:16Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:55:7f:32 10.100.0.4
Jan 30 04:32:16 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:16Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:7f:32 10.100.0.4
Jan 30 04:32:16 np0005601977 podman[218387]: 2026-01-30 09:32:16.846333232 +0000 UTC m=+0.062945736 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:32:16 np0005601977 podman[218388]: 2026-01-30 09:32:16.850568546 +0000 UTC m=+0.065420838 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 30 04:32:17 np0005601977 nova_compute[183130]: 2026-01-30 09:32:17.530 183134 DEBUG nova.compute.manager [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-changed-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:17 np0005601977 nova_compute[183130]: 2026-01-30 09:32:17.531 183134 DEBUG nova.compute.manager [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Refreshing instance network info cache due to event network-changed-48d259d8-4396-4640-8f63-b475acc34639. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:17 np0005601977 nova_compute[183130]: 2026-01-30 09:32:17.531 183134 DEBUG oslo_concurrency.lockutils [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:17 np0005601977 nova_compute[183130]: 2026-01-30 09:32:17.532 183134 DEBUG oslo_concurrency.lockutils [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:17 np0005601977 nova_compute[183130]: 2026-01-30 09:32:17.532 183134 DEBUG nova.network.neutron [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Refreshing network info cache for port 48d259d8-4396-4640-8f63-b475acc34639 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:17Z|00241|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:32:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:17Z|00242|binding|INFO|Releasing lport 663ef153-23ef-4ecf-ab76-b6916e4933b1 from this chassis (sb_readonly=0)
Jan 30 04:32:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:17Z|00243|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:32:17 np0005601977 nova_compute[183130]: 2026-01-30 09:32:17.929 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:19Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:bd:ad 10.100.0.11
Jan 30 04:32:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:19Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:bd:ad 10.100.0.11
Jan 30 04:32:20 np0005601977 nova_compute[183130]: 2026-01-30 09:32:20.445 183134 DEBUG nova.network.neutron [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updated VIF entry in instance network info cache for port 48d259d8-4396-4640-8f63-b475acc34639. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:20 np0005601977 nova_compute[183130]: 2026-01-30 09:32:20.445 183134 DEBUG nova.network.neutron [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updating instance_info_cache with network_info: [{"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:20 np0005601977 nova_compute[183130]: 2026-01-30 09:32:20.470 183134 DEBUG oslo_concurrency.lockutils [req-514ad9e9-db38-4598-b996-5ef253d0f400 req-3c352112-3054-49c5-acdc-405be6b357aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-6ad35592-8899-48da-ac75-5702a09afa33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:20 np0005601977 nova_compute[183130]: 2026-01-30 09:32:20.576 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:20 np0005601977 nova_compute[183130]: 2026-01-30 09:32:20.716 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:21 np0005601977 nova_compute[183130]: 2026-01-30 09:32:21.804 183134 DEBUG nova.compute.manager [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-changed-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:21 np0005601977 nova_compute[183130]: 2026-01-30 09:32:21.805 183134 DEBUG nova.compute.manager [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing instance network info cache due to event network-changed-f469de0f-e330-4b6b-853b-397301173e4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:21 np0005601977 nova_compute[183130]: 2026-01-30 09:32:21.805 183134 DEBUG oslo_concurrency.lockutils [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:21 np0005601977 nova_compute[183130]: 2026-01-30 09:32:21.805 183134 DEBUG oslo_concurrency.lockutils [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:21 np0005601977 nova_compute[183130]: 2026-01-30 09:32:21.805 183134 DEBUG nova.network.neutron [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing network info cache for port f469de0f-e330-4b6b-853b-397301173e4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.105 183134 INFO nova.compute.manager [None req-0ba03fbf-4e81-452e-adae-bb5de68cea6a 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Get console output#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.109 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.439 183134 DEBUG oslo_concurrency.lockutils [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.439 183134 DEBUG oslo_concurrency.lockutils [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.439 183134 INFO nova.compute.manager [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Rebooting instance#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.454 183134 DEBUG oslo_concurrency.lockutils [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.455 183134 DEBUG oslo_concurrency.lockutils [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:22 np0005601977 nova_compute[183130]: 2026-01-30 09:32:22.455 183134 DEBUG nova.network.neutron [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.670 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.670 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.671 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.671 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.671 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.672 183134 INFO nova.compute.manager [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Terminating instance#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.673 183134 DEBUG nova.compute.manager [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.687 183134 DEBUG nova.network.neutron [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updating instance_info_cache with network_info: [{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.691 183134 DEBUG nova.network.neutron [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updated VIF entry in instance network info cache for port f469de0f-e330-4b6b-853b-397301173e4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.692 183134 DEBUG nova.network.neutron [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:24 np0005601977 kernel: tap48d259d8-43 (unregistering): left promiscuous mode
Jan 30 04:32:24 np0005601977 NetworkManager[55565]: <info>  [1769765544.7044] device (tap48d259d8-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.708 183134 DEBUG oslo_concurrency.lockutils [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.711 183134 DEBUG nova.compute.manager [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.714 183134 DEBUG oslo_concurrency.lockutils [req-7dd5f4ab-e667-4928-a543-a81d79a8c50d req-0c752304-ac83-4596-a24e-891a5563b150 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:24Z|00244|binding|INFO|Releasing lport 48d259d8-4396-4640-8f63-b475acc34639 from this chassis (sb_readonly=0)
Jan 30 04:32:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:24Z|00245|binding|INFO|Setting lport 48d259d8-4396-4640-8f63-b475acc34639 down in Southbound
Jan 30 04:32:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:24Z|00246|binding|INFO|Removing iface tap48d259d8-43 ovn-installed in OVS
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.729 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:bd:ad 10.100.0.11', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '6ad35592-8899-48da-ac75-5702a09afa33', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3cddb3-a489-4457-a955-237f0d7cc907, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=48d259d8-4396-4640-8f63-b475acc34639) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.730 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 48d259d8-4396-4640-8f63-b475acc34639 in datapath baf5a6be-5cb0-4dff-8451-d79eaebce0be unbound from our chassis#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.733 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network baf5a6be-5cb0-4dff-8451-d79eaebce0be#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.735 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.746 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[553da38b-8520-4293-9d4f-9cd4538bd5be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:24 np0005601977 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 30 04:32:24 np0005601977 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Consumed 11.708s CPU time.
Jan 30 04:32:24 np0005601977 systemd-machined[154431]: Machine qemu-18-instance-00000016 terminated.
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.782 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1ef026-0ab5-419c-b934-599f07fb2b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.785 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[441d6b92-9c76-4424-b7a4-40e47c30ce51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:24 np0005601977 podman[218446]: 2026-01-30 09:32:24.800597188 +0000 UTC m=+0.054990673 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.803 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f0815f64-6d30-4051-bcb9-28a5634bda39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:24 np0005601977 podman[218447]: 2026-01-30 09:32:24.807973594 +0000 UTC m=+0.065854682 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.820 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c7928870-a270-4977-aac8-7e1c2bd4a729]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaf5a6be-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 7, 'rx_bytes': 1370, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 7, 'rx_bytes': 1370, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389692, 'reachable_time': 34868, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 13, 'inoctets': 936, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 13, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 936, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 13, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218495, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.837 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d20c56a5-c72f-4807-b88b-87403eb1a8e7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbaf5a6be-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389700, 'tstamp': 389700}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218496, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbaf5a6be-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389702, 'tstamp': 389702}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218496, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.839 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaf5a6be-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.840 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.843 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.844 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaf5a6be-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.845 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.845 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbaf5a6be-50, col_values=(('external_ids', {'iface-id': '663ef153-23ef-4ecf-ab76-b6916e4933b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:24.845 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.915 183134 INFO nova.virt.libvirt.driver [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Instance destroyed successfully.#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.915 183134 DEBUG nova.objects.instance [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid 6ad35592-8899-48da-ac75-5702a09afa33 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.930 183134 DEBUG nova.virt.libvirt.vif [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:29:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-58229378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=22,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyOupxEQP5rPhxv3Ovs0buVpKo9DK1SFWIgHh1g4GNOSB04wmj6A6QDKnx5FDTCMUBmlFKzh8u77bIg75/X+JZ/jpIK2VxEM7v20lB4s0EWjtZAb/cScGOoEldqGiJNmQ==',key_name='tempest-TestSecurityGroupsBasicOps-187707995',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-tuyft23r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:08Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=6ad35592-8899-48da-ac75-5702a09afa33,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.930 183134 DEBUG nova.network.os_vif_util [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "48d259d8-4396-4640-8f63-b475acc34639", "address": "fa:16:3e:fa:bd:ad", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48d259d8-43", "ovs_interfaceid": "48d259d8-4396-4640-8f63-b475acc34639", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.931 183134 DEBUG nova.network.os_vif_util [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.932 183134 DEBUG os_vif [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.933 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.933 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48d259d8-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.966 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.967 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.969 183134 INFO os_vif [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:bd:ad,bridge_name='br-int',has_traffic_filtering=True,id=48d259d8-4396-4640-8f63-b475acc34639,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48d259d8-43')#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.970 183134 INFO nova.virt.libvirt.driver [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Deleting instance files /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33_del#033[00m
Jan 30 04:32:24 np0005601977 nova_compute[183130]: 2026-01-30 09:32:24.970 183134 INFO nova.virt.libvirt.driver [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Deletion of /var/lib/nova/instances/6ad35592-8899-48da-ac75-5702a09afa33_del complete#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.020 183134 INFO nova.compute.manager [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.020 183134 DEBUG oslo.service.loopingcall [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.020 183134 DEBUG nova.compute.manager [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.021 183134 DEBUG nova.network.neutron [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.097 183134 DEBUG nova.compute.manager [req-ec2d0b8f-d69d-4fcb-a02b-d37d38b30789 req-0b076a39-6bc3-44f2-81aa-4296d06ecf97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-vif-unplugged-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.097 183134 DEBUG oslo_concurrency.lockutils [req-ec2d0b8f-d69d-4fcb-a02b-d37d38b30789 req-0b076a39-6bc3-44f2-81aa-4296d06ecf97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.098 183134 DEBUG oslo_concurrency.lockutils [req-ec2d0b8f-d69d-4fcb-a02b-d37d38b30789 req-0b076a39-6bc3-44f2-81aa-4296d06ecf97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.098 183134 DEBUG oslo_concurrency.lockutils [req-ec2d0b8f-d69d-4fcb-a02b-d37d38b30789 req-0b076a39-6bc3-44f2-81aa-4296d06ecf97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.098 183134 DEBUG nova.compute.manager [req-ec2d0b8f-d69d-4fcb-a02b-d37d38b30789 req-0b076a39-6bc3-44f2-81aa-4296d06ecf97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] No waiting events found dispatching network-vif-unplugged-48d259d8-4396-4640-8f63-b475acc34639 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.098 183134 DEBUG nova.compute.manager [req-ec2d0b8f-d69d-4fcb-a02b-d37d38b30789 req-0b076a39-6bc3-44f2-81aa-4296d06ecf97 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-vif-unplugged-48d259d8-4396-4640-8f63-b475acc34639 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:32:25 np0005601977 nova_compute[183130]: 2026-01-30 09:32:25.577 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.014 183134 DEBUG nova.network.neutron [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.031 183134 INFO nova.compute.manager [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Took 1.01 seconds to deallocate network for instance.#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.082 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.083 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.225 183134 DEBUG nova.compute.provider_tree [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.243 183134 DEBUG nova.scheduler.client.report [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.266 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.291 183134 INFO nova.scheduler.client.report [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance 6ad35592-8899-48da-ac75-5702a09afa33#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.373 183134 DEBUG oslo_concurrency.lockutils [None req-de663e57-db94-4087-bbc0-3608e77e4d12 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:26 np0005601977 nova_compute[183130]: 2026-01-30 09:32:26.830 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 kernel: tap1c1b6dde-b8 (unregistering): left promiscuous mode
Jan 30 04:32:27 np0005601977 NetworkManager[55565]: <info>  [1769765547.0436] device (tap1c1b6dde-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00247|binding|INFO|Releasing lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 from this chassis (sb_readonly=0)
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00248|binding|INFO|Setting lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 down in Southbound
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00249|binding|INFO|Removing iface tap1c1b6dde-b8 ovn-installed in OVS
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.052 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.056 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.058 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:7f:32 10.100.0.4'], port_security=['fa:16:3e:55:7f:32 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9a05f548-167d-4fc7-b5ec-87e02ee03818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d45d44c8-d301-433f-9039-6429d186e2f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '98cd4d85-cb40-4cb4-a4ca-491f05860190', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08caa929-3ce8-4aa7-a7b2-d4123f0d5025, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=1c1b6dde-b8fc-4af2-9a67-11240761a805) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.059 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 1c1b6dde-b8fc-4af2-9a67-11240761a805 in datapath d45d44c8-d301-433f-9039-6429d186e2f1 unbound from our chassis#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.061 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d45d44c8-d301-433f-9039-6429d186e2f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.061 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ce452f7b-4806-4f82-94b1-2c2e40bee77f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.062 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 namespace which is not needed anymore#033[00m
Jan 30 04:32:27 np0005601977 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 30 04:32:27 np0005601977 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Consumed 12.410s CPU time.
Jan 30 04:32:27 np0005601977 systemd-machined[154431]: Machine qemu-17-instance-00000018 terminated.
Jan 30 04:32:27 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [NOTICE]   (218109) : haproxy version is 2.8.14-c23fe91
Jan 30 04:32:27 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [NOTICE]   (218109) : path to executable is /usr/sbin/haproxy
Jan 30 04:32:27 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [WARNING]  (218109) : Exiting Master process...
Jan 30 04:32:27 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [ALERT]    (218109) : Current worker (218111) exited with code 143 (Terminated)
Jan 30 04:32:27 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218103]: [WARNING]  (218109) : All workers exited. Exiting... (0)
Jan 30 04:32:27 np0005601977 systemd[1]: libpod-d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738.scope: Deactivated successfully.
Jan 30 04:32:27 np0005601977 podman[218550]: 2026-01-30 09:32:27.167359698 +0000 UTC m=+0.037931463 container died d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 30 04:32:27 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738-userdata-shm.mount: Deactivated successfully.
Jan 30 04:32:27 np0005601977 systemd[1]: var-lib-containers-storage-overlay-54e7fdfc6fc440d460f3660486e64713f4d24bda361f710eb887d348bfb6f746-merged.mount: Deactivated successfully.
Jan 30 04:32:27 np0005601977 podman[218550]: 2026-01-30 09:32:27.197854032 +0000 UTC m=+0.068425797 container cleanup d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:32:27 np0005601977 systemd[1]: libpod-conmon-d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738.scope: Deactivated successfully.
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.208 183134 DEBUG nova.compute.manager [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.209 183134 DEBUG oslo_concurrency.lockutils [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6ad35592-8899-48da-ac75-5702a09afa33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.209 183134 DEBUG oslo_concurrency.lockutils [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.209 183134 DEBUG oslo_concurrency.lockutils [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6ad35592-8899-48da-ac75-5702a09afa33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.209 183134 DEBUG nova.compute.manager [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] No waiting events found dispatching network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.210 183134 WARNING nova.compute.manager [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received unexpected event network-vif-plugged-48d259d8-4396-4640-8f63-b475acc34639 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.210 183134 DEBUG nova.compute.manager [req-180ba53c-f6af-471a-9aae-5078bea852ef req-1342952a-031a-451f-bade-a8d621fc0519 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Received event network-vif-deleted-48d259d8-4396-4640-8f63-b475acc34639 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:27 np0005601977 podman[218581]: 2026-01-30 09:32:27.264199227 +0000 UTC m=+0.046057851 container remove d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.267 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c31433-f12b-49db-afda-e70fb7577114]: (4, ('Fri Jan 30 09:32:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 (d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738)\nd560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738\nFri Jan 30 09:32:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 (d560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738)\nd560567dae7509ecb44653aa29aa315f4a695ecff4cf1f3c0144bb401b323738\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.268 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b57bc389-90c4-43bb-88e2-767c6bd4e3b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.270 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd45d44c8-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.272 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 kernel: tapd45d44c8-d0: left promiscuous mode
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.278 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.281 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.282 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[99b455fa-f89d-4345-af4c-30f17c4471f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.293 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[47e0b1ff-f78f-4e9e-b053-1ecb95a17db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.294 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8fc22d-bd72-4a95-b694-64297f1da8c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.305 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b34bfd1d-5f19-4b66-944c-f6ac58f1b1ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404837, 'reachable_time': 38432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218612, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 systemd[1]: run-netns-ovnmeta\x2dd45d44c8\x2dd301\x2d433f\x2d9039\x2d6429d186e2f1.mount: Deactivated successfully.
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.308 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.308 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[44ae2781-feac-47ce-9b12-cc41a280bfa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:3e:b3 10.100.0.10
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:3e:b3 10.100.0.10
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.847 183134 INFO nova.virt.libvirt.driver [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance shutdown successfully.#033[00m
Jan 30 04:32:27 np0005601977 kernel: tap1c1b6dde-b8: entered promiscuous mode
Jan 30 04:32:27 np0005601977 NetworkManager[55565]: <info>  [1769765547.8966] manager: (tap1c1b6dde-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00250|binding|INFO|Claiming lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 for this chassis.
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00251|binding|INFO|1c1b6dde-b8fc-4af2-9a67-11240761a805: Claiming fa:16:3e:55:7f:32 10.100.0.4
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.898 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00252|binding|INFO|Setting lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 ovn-installed in OVS
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.907 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 nova_compute[183130]: 2026-01-30 09:32:27.909 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:27Z|00253|binding|INFO|Setting lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 up in Southbound
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.919 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:7f:32 10.100.0.4'], port_security=['fa:16:3e:55:7f:32 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9a05f548-167d-4fc7-b5ec-87e02ee03818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d45d44c8-d301-433f-9039-6429d186e2f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '98cd4d85-cb40-4cb4-a4ca-491f05860190', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08caa929-3ce8-4aa7-a7b2-d4123f0d5025, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=1c1b6dde-b8fc-4af2-9a67-11240761a805) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.920 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 1c1b6dde-b8fc-4af2-9a67-11240761a805 in datapath d45d44c8-d301-433f-9039-6429d186e2f1 bound to our chassis#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.922 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d45d44c8-d301-433f-9039-6429d186e2f1#033[00m
Jan 30 04:32:27 np0005601977 systemd-udevd[218628]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:32:27 np0005601977 systemd-machined[154431]: New machine qemu-20-instance-00000018.
Jan 30 04:32:27 np0005601977 NetworkManager[55565]: <info>  [1769765547.9307] device (tap1c1b6dde-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:32:27 np0005601977 NetworkManager[55565]: <info>  [1769765547.9312] device (tap1c1b6dde-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:32:27 np0005601977 systemd[1]: Started Virtual Machine qemu-20-instance-00000018.
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.934 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[07452326-6ec3-4886-90ee-8cd11ec49576]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.935 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd45d44c8-d1 in ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.938 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd45d44c8-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.938 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8cb1b4-c113-4f34-a3b1-04e4a77fe0e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.940 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[908ae3b1-8b53-41ea-9ea7-e13c6e2d36d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.949 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[966ae05c-092f-465b-a7d0-d8e4568c743e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.956 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6adac179-b2c7-4810-99a8-51e033bc0a4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.978 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[05f4ee07-dea4-44eb-bdec-7dfdfc8a4948]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:27 np0005601977 systemd-udevd[218632]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:32:27 np0005601977 NetworkManager[55565]: <info>  [1769765547.9853] manager: (tapd45d44c8-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 30 04:32:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:27.984 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[17a135d4-6568-4134-9bd2-37149bc4c7db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.012 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a7df72fa-2317-4af4-94f6-c03ebfaae607]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.015 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f0ee4a-3360-481b-a25b-59573f8ab388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 NetworkManager[55565]: <info>  [1769765548.0328] device (tapd45d44c8-d0): carrier: link connected
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.039 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[18d770da-95e6-443f-8c6e-d54e3238a135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.055 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8f65289b-9981-4a9d-81af-d077ab8fd36d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd45d44c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:29:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407937, 'reachable_time': 33088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218662, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.072 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8cae76df-c504-483e-88b7-e80fd2483dc4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:29b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 407937, 'tstamp': 407937}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218663, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.088 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f82adaf2-2781-44ce-8c06-bf12bdcde724]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd45d44c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:29:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407937, 'reachable_time': 33088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218664, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.115 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b94367ef-a4bd-41ca-9fdf-7bb1b498059e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.170 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[60ca9029-8b07-4b68-b384-a98773d7c339]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.171 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd45d44c8-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.171 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.172 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd45d44c8-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.173 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:28 np0005601977 NetworkManager[55565]: <info>  [1769765548.1743] manager: (tapd45d44c8-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 30 04:32:28 np0005601977 kernel: tapd45d44c8-d0: entered promiscuous mode
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.177 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.180 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd45d44c8-d0, col_values=(('external_ids', {'iface-id': '1479a1c4-748b-426a-bb01-ac1ca7771477'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.181 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:28 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:28Z|00254|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.182 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.184 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d45d44c8-d301-433f-9039-6429d186e2f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d45d44c8-d301-433f-9039-6429d186e2f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.185 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4d5a34-a03e-4bcd-ac6b-3178fed206d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.186 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-d45d44c8-d301-433f-9039-6429d186e2f1
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/d45d44c8-d301-433f-9039-6429d186e2f1.pid.haproxy
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID d45d44c8-d301-433f-9039-6429d186e2f1
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.186 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:28.187 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'env', 'PROCESS_TAG=haproxy-d45d44c8-d301-433f-9039-6429d186e2f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d45d44c8-d301-433f-9039-6429d186e2f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.379 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Removed pending event for 9a05f548-167d-4fc7-b5ec-87e02ee03818 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.380 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765548.3794916, 9a05f548-167d-4fc7-b5ec-87e02ee03818 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.380 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.384 183134 INFO nova.virt.libvirt.driver [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance running successfully.#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.385 183134 INFO nova.virt.libvirt.driver [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance soft rebooted successfully.#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.385 183134 DEBUG nova.compute.manager [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.418 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.422 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.454 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.454 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765548.3806627, 9a05f548-167d-4fc7-b5ec-87e02ee03818 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.455 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] VM Started (Lifecycle Event)#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.464 183134 DEBUG oslo_concurrency.lockutils [None req-7600798a-2f0b-4926-bd30-807e7a4c54c0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.484 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:28 np0005601977 nova_compute[183130]: 2026-01-30 09:32:28.489 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:32:28 np0005601977 podman[218703]: 2026-01-30 09:32:28.512514025 +0000 UTC m=+0.039522640 container create 8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:32:28 np0005601977 systemd[1]: Started libpod-conmon-8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10.scope.
Jan 30 04:32:28 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:32:28 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5113a8de78b7a60b57cad50914f268564c9627ba6f161c2b27f0f2de603f2859/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:32:28 np0005601977 podman[218703]: 2026-01-30 09:32:28.491508369 +0000 UTC m=+0.018516984 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:32:28 np0005601977 podman[218703]: 2026-01-30 09:32:28.596705273 +0000 UTC m=+0.123713908 container init 8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:32:28 np0005601977 podman[218703]: 2026-01-30 09:32:28.605235053 +0000 UTC m=+0.132243708 container start 8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:32:28 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [NOTICE]   (218722) : New worker (218724) forked
Jan 30 04:32:28 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [NOTICE]   (218722) : Loading success.
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.315 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-unplugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.317 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.318 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.318 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.319 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-unplugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.320 183134 WARNING nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received unexpected event network-vif-unplugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.321 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.321 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.322 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.322 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.323 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.324 183134 WARNING nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received unexpected event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.324 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.325 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.326 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.326 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.327 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.327 183134 WARNING nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received unexpected event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.328 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.328 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.329 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.329 183134 DEBUG oslo_concurrency.lockutils [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.330 183134 DEBUG nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.330 183134 WARNING nova.compute.manager [req-b5802271-af51-40fe-9383-c930f9f34160 req-d59b74da-8005-42ff-b707-0f0b2c533c4b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received unexpected event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.657 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.658 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.658 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.659 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.659 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.660 183134 INFO nova.compute.manager [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Terminating instance#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.662 183134 DEBUG nova.compute.manager [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:32:29 np0005601977 kernel: tap695209cb-0d (unregistering): left promiscuous mode
Jan 30 04:32:29 np0005601977 NetworkManager[55565]: <info>  [1769765549.6956] device (tap695209cb-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:32:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:29Z|00255|binding|INFO|Releasing lport 695209cb-0de3-443c-9e7f-c65894975f23 from this chassis (sb_readonly=0)
Jan 30 04:32:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:29Z|00256|binding|INFO|Setting lport 695209cb-0de3-443c-9e7f-c65894975f23 down in Southbound
Jan 30 04:32:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:29Z|00257|binding|INFO|Removing iface tap695209cb-0d ovn-installed in OVS
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.698 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.700 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.704 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:ea:ed 10.100.0.8'], port_security=['fa:16:3e:33:ea:ed 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '93629e5c-ca92-47ac-8567-35d85b4e2a73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a866993-35dd-4fa6-b18e-da0d2901678a 6a1909f5-bead-4d28-9b18-810f48b11797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3cddb3-a489-4457-a955-237f0d7cc907, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=695209cb-0de3-443c-9e7f-c65894975f23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.706 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 695209cb-0de3-443c-9e7f-c65894975f23 in datapath baf5a6be-5cb0-4dff-8451-d79eaebce0be unbound from our chassis#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.707 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baf5a6be-5cb0-4dff-8451-d79eaebce0be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.707 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.708 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d1e2e6-078a-41d4-9b8f-0b6a1d47ad89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.709 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be namespace which is not needed anymore#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.718 183134 DEBUG nova.compute.manager [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-changed-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.719 183134 DEBUG nova.compute.manager [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Refreshing instance network info cache due to event network-changed-695209cb-0de3-443c-9e7f-c65894975f23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.719 183134 DEBUG oslo_concurrency.lockutils [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.720 183134 DEBUG oslo_concurrency.lockutils [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.720 183134 DEBUG nova.network.neutron [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Refreshing network info cache for port 695209cb-0de3-443c-9e7f-c65894975f23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:29 np0005601977 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 30 04:32:29 np0005601977 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000014.scope: Consumed 18.956s CPU time.
Jan 30 04:32:29 np0005601977 systemd-machined[154431]: Machine qemu-12-instance-00000014 terminated.
Jan 30 04:32:29 np0005601977 podman[218734]: 2026-01-30 09:32:29.774903897 +0000 UTC m=+0.065696048 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:32:29 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [NOTICE]   (216052) : haproxy version is 2.8.14-c23fe91
Jan 30 04:32:29 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [NOTICE]   (216052) : path to executable is /usr/sbin/haproxy
Jan 30 04:32:29 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [WARNING]  (216052) : Exiting Master process...
Jan 30 04:32:29 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [WARNING]  (216052) : Exiting Master process...
Jan 30 04:32:29 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [ALERT]    (216052) : Current worker (216054) exited with code 143 (Terminated)
Jan 30 04:32:29 np0005601977 neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be[216048]: [WARNING]  (216052) : All workers exited. Exiting... (0)
Jan 30 04:32:29 np0005601977 systemd[1]: libpod-0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48.scope: Deactivated successfully.
Jan 30 04:32:29 np0005601977 podman[218779]: 2026-01-30 09:32:29.822277755 +0000 UTC m=+0.049408589 container died 0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:32:29 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48-userdata-shm.mount: Deactivated successfully.
Jan 30 04:32:29 np0005601977 systemd[1]: var-lib-containers-storage-overlay-78de20b395c1a1012a1f79dfd12ff1ff3d2ab27fb225c22e144a19b893a92898-merged.mount: Deactivated successfully.
Jan 30 04:32:29 np0005601977 podman[218779]: 2026-01-30 09:32:29.85858151 +0000 UTC m=+0.085712374 container cleanup 0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 30 04:32:29 np0005601977 systemd[1]: libpod-conmon-0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48.scope: Deactivated successfully.
Jan 30 04:32:29 np0005601977 NetworkManager[55565]: <info>  [1769765549.8788] manager: (tap695209cb-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.879 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.884 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.908 183134 INFO nova.virt.libvirt.driver [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Instance destroyed successfully.#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.908 183134 DEBUG nova.objects.instance [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid 93629e5c-ca92-47ac-8567-35d85b4e2a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:29 np0005601977 podman[218810]: 2026-01-30 09:32:29.916125587 +0000 UTC m=+0.043066414 container remove 0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.919 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3883eef4-6bdc-4d46-9863-5584a6c122d1]: (4, ('Fri Jan 30 09:32:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be (0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48)\n0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48\nFri Jan 30 09:32:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be (0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48)\n0faa338424f0b11067be3dc2be4ab742de5d82bd3ee8d0e8c44185b11e695c48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.921 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[792be4ed-c257-476f-8ea3-2b5cbac1d523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.921 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaf5a6be-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:29 np0005601977 kernel: tapbaf5a6be-50: left promiscuous mode
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.923 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.929 183134 DEBUG nova.virt.libvirt.vif [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:29:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-2120167569',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=20,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJyOupxEQP5rPhxv3Ovs0buVpKo9DK1SFWIgHh1g4GNOSB04wmj6A6QDKnx5FDTCMUBmlFKzh8u77bIg75/X+JZ/jpIK2VxEM7v20lB4s0EWjtZAb/cScGOoEldqGiJNmQ==',key_name='tempest-TestSecurityGroupsBasicOps-187707995',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:29:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-6j7tnxpj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:29:28Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=93629e5c-ca92-47ac-8567-35d85b4e2a73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.930 183134 DEBUG nova.network.os_vif_util [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.931 183134 DEBUG nova.network.os_vif_util [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.932 183134 DEBUG os_vif [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.932 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a9c0b0-3840-4e71-a2fb-47a36e2c193f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.935 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.936 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap695209cb-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.937 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.940 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.941 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.944 183134 INFO os_vif [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:ea:ed,bridge_name='br-int',has_traffic_filtering=True,id=695209cb-0de3-443c-9e7f-c65894975f23,network=Network(baf5a6be-5cb0-4dff-8451-d79eaebce0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap695209cb-0d')#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.945 183134 INFO nova.virt.libvirt.driver [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Deleting instance files /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73_del#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.948 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[66722153-d610-4efd-827d-083dfd5f7ef9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.946 183134 INFO nova.virt.libvirt.driver [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Deletion of /var/lib/nova/instances/93629e5c-ca92-47ac-8567-35d85b4e2a73_del complete#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.951 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[85ce5014-49af-44e0-ba12-a478fd917edd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.966 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a4976a73-72d1-4367-ad96-6e1eb8096458]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389688, 'reachable_time': 25518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218839, 'error': None, 'target': 'ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.970 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-baf5a6be-5cb0-4dff-8451-d79eaebce0be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:32:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:29.970 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[f885fd54-9a93-4fb1-bd5e-e471fc001e8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:29 np0005601977 systemd[1]: run-netns-ovnmeta\x2dbaf5a6be\x2d5cb0\x2d4dff\x2d8451\x2dd79eaebce0be.mount: Deactivated successfully.
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.999 183134 INFO nova.compute.manager [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:32:29 np0005601977 nova_compute[183130]: 2026-01-30 09:32:29.999 183134 DEBUG oslo.service.loopingcall [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:32:30 np0005601977 nova_compute[183130]: 2026-01-30 09:32:30.000 183134 DEBUG nova.compute.manager [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:32:30 np0005601977 nova_compute[183130]: 2026-01-30 09:32:30.000 183134 DEBUG nova.network.neutron [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:32:30 np0005601977 nova_compute[183130]: 2026-01-30 09:32:30.578 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.442 183134 DEBUG nova.compute.manager [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-vif-unplugged-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.442 183134 DEBUG oslo_concurrency.lockutils [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.443 183134 DEBUG oslo_concurrency.lockutils [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.443 183134 DEBUG oslo_concurrency.lockutils [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.443 183134 DEBUG nova.compute.manager [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] No waiting events found dispatching network-vif-unplugged-695209cb-0de3-443c-9e7f-c65894975f23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.443 183134 DEBUG nova.compute.manager [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-vif-unplugged-695209cb-0de3-443c-9e7f-c65894975f23 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.443 183134 DEBUG nova.compute.manager [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.444 183134 DEBUG oslo_concurrency.lockutils [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.444 183134 DEBUG oslo_concurrency.lockutils [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.444 183134 DEBUG oslo_concurrency.lockutils [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.444 183134 DEBUG nova.compute.manager [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] No waiting events found dispatching network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.444 183134 WARNING nova.compute.manager [req-b9b93cfa-200c-48c6-a5d4-a9109aa4e44c req-f640f732-1731-4187-b7c1-16469bb9f39d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received unexpected event network-vif-plugged-695209cb-0de3-443c-9e7f-c65894975f23 for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.826 183134 DEBUG nova.network.neutron [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.847 183134 INFO nova.compute.manager [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Took 1.85 seconds to deallocate network for instance.#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.911 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:31 np0005601977 nova_compute[183130]: 2026-01-30 09:32:31.912 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:32 np0005601977 nova_compute[183130]: 2026-01-30 09:32:32.011 183134 DEBUG nova.compute.provider_tree [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:32 np0005601977 nova_compute[183130]: 2026-01-30 09:32:32.026 183134 DEBUG nova.scheduler.client.report [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:32 np0005601977 nova_compute[183130]: 2026-01-30 09:32:32.051 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:32 np0005601977 nova_compute[183130]: 2026-01-30 09:32:32.088 183134 INFO nova.scheduler.client.report [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance 93629e5c-ca92-47ac-8567-35d85b4e2a73#033[00m
Jan 30 04:32:32 np0005601977 nova_compute[183130]: 2026-01-30 09:32:32.163 183134 DEBUG oslo_concurrency.lockutils [None req-f272c638-6427-4862-b2b2-be9269f6c9a6 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "93629e5c-ca92-47ac-8567-35d85b4e2a73" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:33 np0005601977 nova_compute[183130]: 2026-01-30 09:32:33.545 183134 DEBUG nova.compute.manager [req-93350185-791d-4157-a546-ef99e74d158c req-e59c5f80-9743-4025-b535-4130c07c431d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Received event network-vif-deleted-695209cb-0de3-443c-9e7f-c65894975f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:33 np0005601977 nova_compute[183130]: 2026-01-30 09:32:33.580 183134 DEBUG nova.network.neutron [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updated VIF entry in instance network info cache for port 695209cb-0de3-443c-9e7f-c65894975f23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:33 np0005601977 nova_compute[183130]: 2026-01-30 09:32:33.581 183134 DEBUG nova.network.neutron [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Updating instance_info_cache with network_info: [{"id": "695209cb-0de3-443c-9e7f-c65894975f23", "address": "fa:16:3e:33:ea:ed", "network": {"id": "baf5a6be-5cb0-4dff-8451-d79eaebce0be", "bridge": "br-int", "label": "tempest-network-smoke--414660656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap695209cb-0d", "ovs_interfaceid": "695209cb-0de3-443c-9e7f-c65894975f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:33 np0005601977 nova_compute[183130]: 2026-01-30 09:32:33.613 183134 DEBUG oslo_concurrency.lockutils [req-8fbc5e67-0ae8-4d48-a048-3e36badffb23 req-67c3cd33-3f92-4784-b7a7-9df83d95975c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-93629e5c-ca92-47ac-8567-35d85b4e2a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:34 np0005601977 nova_compute[183130]: 2026-01-30 09:32:34.977 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:35 np0005601977 nova_compute[183130]: 2026-01-30 09:32:35.132 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:35 np0005601977 nova_compute[183130]: 2026-01-30 09:32:35.189 183134 INFO nova.compute.manager [None req-aa367fec-426a-4ab8-a96d-b4d33fbfefd1 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Get console output#033[00m
Jan 30 04:32:35 np0005601977 nova_compute[183130]: 2026-01-30 09:32:35.194 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:32:35 np0005601977 nova_compute[183130]: 2026-01-30 09:32:35.579 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:37 np0005601977 podman[218840]: 2026-01-30 09:32:37.86482636 +0000 UTC m=+0.067352306 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:32:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:39Z|00258|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:32:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:39Z|00259|binding|INFO|Releasing lport 1479a1c4-748b-426a-bb01-ac1ca7771477 from this chassis (sb_readonly=0)
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.196 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.446 183134 DEBUG oslo_concurrency.lockutils [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "interface-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.447 183134 DEBUG oslo_concurrency.lockutils [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.447 183134 DEBUG nova.objects.instance [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'flavor' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:39Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:7f:32 10.100.0.4
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.792 183134 DEBUG nova.objects.instance [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_requests' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.807 183134 DEBUG nova.network.neutron [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.914 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765544.9134777, 6ad35592-8899-48da-ac75-5702a09afa33 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.915 183134 INFO nova.compute.manager [-] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:32:39 np0005601977 nova_compute[183130]: 2026-01-30 09:32:39.938 183134 DEBUG nova.compute.manager [None req-977a1886-7bab-499c-b5de-607fddd70d80 - - - - - -] [instance: 6ad35592-8899-48da-ac75-5702a09afa33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:40 np0005601977 nova_compute[183130]: 2026-01-30 09:32:40.015 183134 DEBUG nova.policy [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:32:40 np0005601977 nova_compute[183130]: 2026-01-30 09:32:40.017 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:40 np0005601977 nova_compute[183130]: 2026-01-30 09:32:40.582 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:41 np0005601977 nova_compute[183130]: 2026-01-30 09:32:41.490 183134 DEBUG nova.network.neutron [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Successfully created port: 3e8e7ac3-7773-46da-922a-c24dce47f456 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:32:42 np0005601977 nova_compute[183130]: 2026-01-30 09:32:42.846 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.020 183134 DEBUG nova.network.neutron [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Successfully updated port: 3e8e7ac3-7773-46da-922a-c24dce47f456 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.036 183134 DEBUG oslo_concurrency.lockutils [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.036 183134 DEBUG oslo_concurrency.lockutils [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.037 183134 DEBUG nova.network.neutron [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.091 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.176 183134 DEBUG nova.compute.manager [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-changed-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.176 183134 DEBUG nova.compute.manager [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing instance network info cache due to event network-changed-3e8e7ac3-7773-46da-922a-c24dce47f456. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:43 np0005601977 nova_compute[183130]: 2026-01-30 09:32:43.177 183134 DEBUG oslo_concurrency.lockutils [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:44.068 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:44 np0005601977 nova_compute[183130]: 2026-01-30 09:32:44.068 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:44.070 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:32:44 np0005601977 nova_compute[183130]: 2026-01-30 09:32:44.907 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765549.9054863, 93629e5c-ca92-47ac-8567-35d85b4e2a73 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:32:44 np0005601977 nova_compute[183130]: 2026-01-30 09:32:44.908 183134 INFO nova.compute.manager [-] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:32:44 np0005601977 nova_compute[183130]: 2026-01-30 09:32:44.931 183134 DEBUG nova.compute.manager [None req-fe07d2ec-a769-4ae5-885c-c50639f97bd7 - - - - - -] [instance: 93629e5c-ca92-47ac-8567-35d85b4e2a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.040 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.072 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.420 183134 INFO nova.compute.manager [None req-2cab32f3-9156-48d4-b25a-6db7a750ea92 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Get console output#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.427 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.585 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.738 183134 DEBUG nova.network.neutron [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.780 183134 DEBUG oslo_concurrency.lockutils [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.781 183134 DEBUG oslo_concurrency.lockutils [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.781 183134 DEBUG nova.network.neutron [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing network info cache for port 3e8e7ac3-7773-46da-922a-c24dce47f456 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.786 183134 DEBUG nova.virt.libvirt.vif [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.787 183134 DEBUG nova.network.os_vif_util [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.787 183134 DEBUG nova.network.os_vif_util [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.788 183134 DEBUG os_vif [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.789 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.789 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.789 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.792 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.793 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e8e7ac3-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.793 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e8e7ac3-77, col_values=(('external_ids', {'iface-id': '3e8e7ac3-7773-46da-922a-c24dce47f456', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:3b:1b', 'vm-uuid': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.795 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 NetworkManager[55565]: <info>  [1769765565.7960] manager: (tap3e8e7ac3-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.801 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.803 183134 INFO os_vif [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77')#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.803 183134 DEBUG nova.virt.libvirt.vif [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.804 183134 DEBUG nova.network.os_vif_util [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.804 183134 DEBUG nova.network.os_vif_util [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.807 183134 DEBUG nova.virt.libvirt.guest [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] attach device xml: <interface type="ethernet">
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:e7:3b:1b"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <target dev="tap3e8e7ac3-77"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:32:45 np0005601977 nova_compute[183130]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 30 04:32:45 np0005601977 kernel: tap3e8e7ac3-77: entered promiscuous mode
Jan 30 04:32:45 np0005601977 NetworkManager[55565]: <info>  [1769765565.8187] manager: (tap3e8e7ac3-77): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 30 04:32:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:45Z|00260|binding|INFO|Claiming lport 3e8e7ac3-7773-46da-922a-c24dce47f456 for this chassis.
Jan 30 04:32:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:45Z|00261|binding|INFO|3e8e7ac3-7773-46da-922a-c24dce47f456: Claiming fa:16:3e:e7:3b:1b 10.100.0.20
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.820 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.822 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.832 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.832 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:3b:1b 10.100.0.20'], port_security=['fa:16:3e:e7:3b:1b 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': '947fe520-942d-4287-9e9d-738b24a6a1e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbe9ac69-dab6-405f-be15-dcf6f6e9930e, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=3e8e7ac3-7773-46da-922a-c24dce47f456) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:45Z|00262|binding|INFO|Setting lport 3e8e7ac3-7773-46da-922a-c24dce47f456 ovn-installed in OVS
Jan 30 04:32:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:45Z|00263|binding|INFO|Setting lport 3e8e7ac3-7773-46da-922a-c24dce47f456 up in Southbound
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.835 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.835 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 3e8e7ac3-7773-46da-922a-c24dce47f456 in datapath 6c079c23-8031-4776-b9b7-153f2dd27fc7 bound to our chassis#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.840 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c079c23-8031-4776-b9b7-153f2dd27fc7#033[00m
Jan 30 04:32:45 np0005601977 systemd-udevd[218880]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.851 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d0beafe7-ba95-4e32-8115-470c32c6cfc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.852 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c079c23-81 in ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.854 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c079c23-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.854 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ce2fbdd9-dbd3-4b3f-b3be-8cf5868d9271]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.855 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ba39cadb-afa6-42db-9128-6f537d06d838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.863 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e468c4-299c-47ce-82a9-f3e36356c198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 NetworkManager[55565]: <info>  [1769765565.8653] device (tap3e8e7ac3-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:32:45 np0005601977 NetworkManager[55565]: <info>  [1769765565.8662] device (tap3e8e7ac3-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.876 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a8000ab7-d6f6-44af-9741-3fbebf099bdf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.896 183134 DEBUG nova.virt.libvirt.driver [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.896 183134 DEBUG nova.virt.libvirt.driver [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.896 183134 DEBUG nova.virt.libvirt.driver [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:ac:3e:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.896 183134 DEBUG nova.virt.libvirt.driver [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:e7:3b:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.898 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f82791c6-6130-4140-913a-1b70f82c74d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 systemd-udevd[218883]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:32:45 np0005601977 NetworkManager[55565]: <info>  [1769765565.9050] manager: (tap6c079c23-80): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.905 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[968f72e0-46c7-471f-b203-5c992dde439d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.930 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[355ce95e-3b08-415f-947b-fcd1e1cb3407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.932 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b98f338c-fc2b-42d3-825b-dbc4a2d00f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 NetworkManager[55565]: <info>  [1769765565.9461] device (tap6c079c23-80): carrier: link connected
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.948 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[011bfcaf-0e2a-48f2-8d55-48845f125d20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.961 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5ff573-05bd-4ad3-835a-a40fe90fadf5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c079c23-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:96:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409728, 'reachable_time': 26163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218906, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.971 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[de93463b-02e2-4ccd-a72a-e0d6a2516cb4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:96e2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409728, 'tstamp': 409728}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218907, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:45 np0005601977 nova_compute[183130]: 2026-01-30 09:32:45.983 183134 DEBUG nova.virt.libvirt.guest [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:32:45</nova:creationTime>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:32:45 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    <nova:port uuid="3e8e7ac3-7773-46da-922a-c24dce47f456">
Jan 30 04:32:45 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:32:45 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:32:45 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:32:45 np0005601977 nova_compute[183130]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 30 04:32:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:45.984 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b742d99f-c171-4fc6-852a-9b14b3d9447a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c079c23-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:96:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409728, 'reachable_time': 26163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218908, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.004 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[49a00751-ff75-4509-88c6-f34900439667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:46 np0005601977 nova_compute[183130]: 2026-01-30 09:32:46.018 183134 DEBUG oslo_concurrency.lockutils [None req-829eb10c-9f22-4802-898b-859a6247a884 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.057 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbd7f05-df07-48ce-a301-d772eb14d2c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.058 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c079c23-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.058 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.059 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c079c23-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:46 np0005601977 kernel: tap6c079c23-80: entered promiscuous mode
Jan 30 04:32:46 np0005601977 nova_compute[183130]: 2026-01-30 09:32:46.107 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:46 np0005601977 NetworkManager[55565]: <info>  [1769765566.1093] manager: (tap6c079c23-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 30 04:32:46 np0005601977 nova_compute[183130]: 2026-01-30 09:32:46.110 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.111 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c079c23-80, col_values=(('external_ids', {'iface-id': 'ff915305-2000-4180-8452-99d99c6f677f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:46 np0005601977 nova_compute[183130]: 2026-01-30 09:32:46.113 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:46Z|00264|binding|INFO|Releasing lport ff915305-2000-4180-8452-99d99c6f677f from this chassis (sb_readonly=0)
Jan 30 04:32:46 np0005601977 nova_compute[183130]: 2026-01-30 09:32:46.123 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.123 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c079c23-8031-4776-b9b7-153f2dd27fc7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c079c23-8031-4776-b9b7-153f2dd27fc7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.125 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[97739124-bcee-4e63-bbb1-3d0850de4612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.126 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-6c079c23-8031-4776-b9b7-153f2dd27fc7
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/6c079c23-8031-4776-b9b7-153f2dd27fc7.pid.haproxy
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 6c079c23-8031-4776-b9b7-153f2dd27fc7
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:32:46 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:46.127 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'env', 'PROCESS_TAG=haproxy-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c079c23-8031-4776-b9b7-153f2dd27fc7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:32:46 np0005601977 podman[218940]: 2026-01-30 09:32:46.457562144 +0000 UTC m=+0.040721105 container create b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:32:46 np0005601977 systemd[1]: Started libpod-conmon-b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c.scope.
Jan 30 04:32:46 np0005601977 podman[218940]: 2026-01-30 09:32:46.437444254 +0000 UTC m=+0.020603215 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:32:46 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:32:46 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a223d6f71021fcb7d14385395842d22a96185d2fa64ffb92854e08f2047da1df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:32:46 np0005601977 podman[218940]: 2026-01-30 09:32:46.557822523 +0000 UTC m=+0.140981504 container init b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:32:46 np0005601977 podman[218940]: 2026-01-30 09:32:46.564974643 +0000 UTC m=+0.148133614 container start b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:32:46 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [NOTICE]   (218959) : New worker (218961) forked
Jan 30 04:32:46 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [NOTICE]   (218959) : Loading success.
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.022 183134 DEBUG nova.compute.manager [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-changed-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.022 183134 DEBUG nova.compute.manager [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Refreshing instance network info cache due to event network-changed-1c1b6dde-b8fc-4af2-9a67-11240761a805. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.022 183134 DEBUG oslo_concurrency.lockutils [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.022 183134 DEBUG oslo_concurrency.lockutils [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.023 183134 DEBUG nova.network.neutron [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Refreshing network info cache for port 1c1b6dde-b8fc-4af2-9a67-11240761a805 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.115 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.115 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.116 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.116 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.116 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.118 183134 INFO nova.compute.manager [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Terminating instance#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.119 183134 DEBUG nova.compute.manager [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:32:47 np0005601977 kernel: tap1c1b6dde-b8 (unregistering): left promiscuous mode
Jan 30 04:32:47 np0005601977 NetworkManager[55565]: <info>  [1769765567.1536] device (tap1c1b6dde-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:32:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:47Z|00265|binding|INFO|Releasing lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 from this chassis (sb_readonly=0)
Jan 30 04:32:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:47Z|00266|binding|INFO|Setting lport 1c1b6dde-b8fc-4af2-9a67-11240761a805 down in Southbound
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.160 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:47Z|00267|binding|INFO|Removing iface tap1c1b6dde-b8 ovn-installed in OVS
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.163 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.168 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.172 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:7f:32 10.100.0.4'], port_security=['fa:16:3e:55:7f:32 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9a05f548-167d-4fc7-b5ec-87e02ee03818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d45d44c8-d301-433f-9039-6429d186e2f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '98cd4d85-cb40-4cb4-a4ca-491f05860190', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08caa929-3ce8-4aa7-a7b2-d4123f0d5025, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=1c1b6dde-b8fc-4af2-9a67-11240761a805) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.173 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 1c1b6dde-b8fc-4af2-9a67-11240761a805 in datapath d45d44c8-d301-433f-9039-6429d186e2f1 unbound from our chassis#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.176 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d45d44c8-d301-433f-9039-6429d186e2f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.177 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e3dd2ab6-a970-418d-a89a-d4fa1b0faedb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.177 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 namespace which is not needed anymore#033[00m
Jan 30 04:32:47 np0005601977 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 30 04:32:47 np0005601977 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000018.scope: Consumed 11.836s CPU time.
Jan 30 04:32:47 np0005601977 systemd-machined[154431]: Machine qemu-20-instance-00000018 terminated.
Jan 30 04:32:47 np0005601977 podman[218975]: 2026-01-30 09:32:47.243364952 +0000 UTC m=+0.065537812 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:32:47 np0005601977 podman[218970]: 2026-01-30 09:32:47.256957591 +0000 UTC m=+0.074232787 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, version=9.7)
Jan 30 04:32:47 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [NOTICE]   (218722) : haproxy version is 2.8.14-c23fe91
Jan 30 04:32:47 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [NOTICE]   (218722) : path to executable is /usr/sbin/haproxy
Jan 30 04:32:47 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [WARNING]  (218722) : Exiting Master process...
Jan 30 04:32:47 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [WARNING]  (218722) : Exiting Master process...
Jan 30 04:32:47 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [ALERT]    (218722) : Current worker (218724) exited with code 143 (Terminated)
Jan 30 04:32:47 np0005601977 neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1[218718]: [WARNING]  (218722) : All workers exited. Exiting... (0)
Jan 30 04:32:47 np0005601977 systemd[1]: libpod-8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10.scope: Deactivated successfully.
Jan 30 04:32:47 np0005601977 podman[219028]: 2026-01-30 09:32:47.28966283 +0000 UTC m=+0.037172111 container died 8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:32:47 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10-userdata-shm.mount: Deactivated successfully.
Jan 30 04:32:47 np0005601977 systemd[1]: var-lib-containers-storage-overlay-5113a8de78b7a60b57cad50914f268564c9627ba6f161c2b27f0f2de603f2859-merged.mount: Deactivated successfully.
Jan 30 04:32:47 np0005601977 podman[219028]: 2026-01-30 09:32:47.319604918 +0000 UTC m=+0.067114199 container cleanup 8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:32:47 np0005601977 systemd[1]: libpod-conmon-8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10.scope: Deactivated successfully.
Jan 30 04:32:47 np0005601977 NetworkManager[55565]: <info>  [1769765567.3357] manager: (tap1c1b6dde-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.337 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.340 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:47Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:3b:1b 10.100.0.20
Jan 30 04:32:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:47Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:3b:1b 10.100.0.20
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.363 183134 INFO nova.virt.libvirt.driver [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance destroyed successfully.#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.363 183134 DEBUG nova.objects.instance [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 9a05f548-167d-4fc7-b5ec-87e02ee03818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:47 np0005601977 podman[219060]: 2026-01-30 09:32:47.367668917 +0000 UTC m=+0.037182151 container remove 8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.370 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bc62f5e4-e5d3-459a-8ffe-b2f0db1b5247]: (4, ('Fri Jan 30 09:32:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 (8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10)\n8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10\nFri Jan 30 09:32:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 (8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10)\n8166239783baae0f2cf6e5813b6632302ed717fac695b9afbd60642df65f7a10\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.371 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[202b2c8e-a505-4d9f-abd9-49444bd83fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.372 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd45d44c8-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.374 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 kernel: tapd45d44c8-d0: left promiscuous mode
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.381 183134 DEBUG nova.virt.libvirt.vif [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1575574141',display_name='tempest-TestNetworkAdvancedServerOps-server-1575574141',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1575574141',id=24,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7+DEG0j4DwrhGi6Li8X1HJY2IlLENpe8wHNWNUSYf6uplctgauCp7ClsFJ4rfPZ7qGthxQvxsuwAx1dpIqacU6XqzNEDTVhJLa2bDmNGPQnx6wh817TnRHA/3QIu2i5w==',key_name='tempest-TestNetworkAdvancedServerOps-834319867',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-ceq2vuir',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:28Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=9a05f548-167d-4fc7-b5ec-87e02ee03818,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.381 183134 DEBUG nova.network.os_vif_util [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.382 183134 DEBUG nova.network.os_vif_util [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.383 183134 DEBUG os_vif [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.383 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2294d68d-6e74-496a-b208-1c2de09a3621]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.384 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.384 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c1b6dde-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.385 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.387 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.390 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.392 183134 INFO os_vif [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:7f:32,bridge_name='br-int',has_traffic_filtering=True,id=1c1b6dde-b8fc-4af2-9a67-11240761a805,network=Network(d45d44c8-d301-433f-9039-6429d186e2f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c1b6dde-b8')#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.393 183134 INFO nova.virt.libvirt.driver [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Deleting instance files /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818_del#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.393 183134 INFO nova.virt.libvirt.driver [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Deletion of /var/lib/nova/instances/9a05f548-167d-4fc7-b5ec-87e02ee03818_del complete#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.393 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[475dea92-c7f7-4dec-bfe1-1d10051831dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.394 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4baadb36-3272-44f2-8291-05d4d5aefde8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.404 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9c733ba6-f73a-4924-b26d-20c5b74adee4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407931, 'reachable_time': 28261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219093, 'error': None, 'target': 'ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.406 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d45d44c8-d301-433f-9039-6429d186e2f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:32:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:47.407 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[e0316ac2-ad3d-43ad-aca3-c2cb09a70e95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:32:47 np0005601977 systemd[1]: run-netns-ovnmeta\x2dd45d44c8\x2dd301\x2d433f\x2d9039\x2d6429d186e2f1.mount: Deactivated successfully.
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.456 183134 INFO nova.compute.manager [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.457 183134 DEBUG oslo.service.loopingcall [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.457 183134 DEBUG nova.compute.manager [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:32:47 np0005601977 nova_compute[183130]: 2026-01-30 09:32:47.457 183134 DEBUG nova.network.neutron [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.285 183134 DEBUG nova.compute.manager [req-5595733a-8403-4995-9696-3324d8975fd9 req-0827b33b-75f6-4183-9201-81e7ab12860d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-unplugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.285 183134 DEBUG oslo_concurrency.lockutils [req-5595733a-8403-4995-9696-3324d8975fd9 req-0827b33b-75f6-4183-9201-81e7ab12860d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.286 183134 DEBUG oslo_concurrency.lockutils [req-5595733a-8403-4995-9696-3324d8975fd9 req-0827b33b-75f6-4183-9201-81e7ab12860d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.286 183134 DEBUG oslo_concurrency.lockutils [req-5595733a-8403-4995-9696-3324d8975fd9 req-0827b33b-75f6-4183-9201-81e7ab12860d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.287 183134 DEBUG nova.compute.manager [req-5595733a-8403-4995-9696-3324d8975fd9 req-0827b33b-75f6-4183-9201-81e7ab12860d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-unplugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.287 183134 DEBUG nova.compute.manager [req-5595733a-8403-4995-9696-3324d8975fd9 req-0827b33b-75f6-4183-9201-81e7ab12860d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-unplugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.309 183134 DEBUG nova.network.neutron [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updated VIF entry in instance network info cache for port 3e8e7ac3-7773-46da-922a-c24dce47f456. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.309 183134 DEBUG nova.network.neutron [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:48 np0005601977 nova_compute[183130]: 2026-01-30 09:32:48.357 183134 DEBUG oslo_concurrency.lockutils [req-3a775523-fb78-451a-beb3-5ca80a347a7b req-6522d1b9-7da4-4d15-8546-2a7f3a558c24 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.008 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.114 183134 DEBUG nova.network.neutron [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.146 183134 INFO nova.compute.manager [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Took 1.69 seconds to deallocate network for instance.#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.278 183134 DEBUG nova.network.neutron [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updated VIF entry in instance network info cache for port 1c1b6dde-b8fc-4af2-9a67-11240761a805. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.279 183134 DEBUG nova.network.neutron [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Updating instance_info_cache with network_info: [{"id": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "address": "fa:16:3e:55:7f:32", "network": {"id": "d45d44c8-d301-433f-9039-6429d186e2f1", "bridge": "br-int", "label": "tempest-network-smoke--228812816", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c1b6dde-b8", "ovs_interfaceid": "1c1b6dde-b8fc-4af2-9a67-11240761a805", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.299 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.300 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.375 183134 DEBUG oslo_concurrency.lockutils [req-e4f7ad89-69b4-4a31-b4b4-c01f5e44559b req-e2f637da-d7e4-449f-bec5-8feb9db47331 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-9a05f548-167d-4fc7-b5ec-87e02ee03818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.444 183134 DEBUG nova.compute.provider_tree [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.486 183134 DEBUG nova.scheduler.client.report [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.550 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.605 183134 INFO nova.scheduler.client.report [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocations for instance 9a05f548-167d-4fc7-b5ec-87e02ee03818#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.748 183134 DEBUG oslo_concurrency.lockutils [None req-85a9bf24-39a7-4052-9306-649afd23619e 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:49 np0005601977 nova_compute[183130]: 2026-01-30 09:32:49.810 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.619 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.695 183134 DEBUG nova.compute.manager [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.696 183134 DEBUG oslo_concurrency.lockutils [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.696 183134 DEBUG oslo_concurrency.lockutils [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.696 183134 DEBUG oslo_concurrency.lockutils [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "9a05f548-167d-4fc7-b5ec-87e02ee03818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.697 183134 DEBUG nova.compute.manager [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] No waiting events found dispatching network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.697 183134 WARNING nova.compute.manager [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received unexpected event network-vif-plugged-1c1b6dde-b8fc-4af2-9a67-11240761a805 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.697 183134 DEBUG nova.compute.manager [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Received event network-vif-deleted-1c1b6dde-b8fc-4af2-9a67-11240761a805 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.697 183134 INFO nova.compute.manager [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Neutron deleted interface 1c1b6dde-b8fc-4af2-9a67-11240761a805; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.698 183134 DEBUG nova.network.neutron [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 30 04:32:50 np0005601977 nova_compute[183130]: 2026-01-30 09:32:50.699 183134 DEBUG nova.compute.manager [req-86679f2e-0f69-42a7-9bdd-9984fa42e6b5 req-7dab5c40-acab-479f-a9ab-4c9c49bc11af dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Detach interface failed, port_id=1c1b6dde-b8fc-4af2-9a67-11240761a805, reason: Instance 9a05f548-167d-4fc7-b5ec-87e02ee03818 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 30 04:32:52 np0005601977 nova_compute[183130]: 2026-01-30 09:32:52.385 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:54Z|00268|binding|INFO|Releasing lport ff915305-2000-4180-8452-99d99c6f677f from this chassis (sb_readonly=0)
Jan 30 04:32:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:32:54Z|00269|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:32:54 np0005601977 nova_compute[183130]: 2026-01-30 09:32:54.748 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.034 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.035 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.055 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.137 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.138 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.150 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.150 183134 INFO nova.compute.claims [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.280 183134 DEBUG nova.compute.provider_tree [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.294 183134 DEBUG nova.scheduler.client.report [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.334 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.334 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.394 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.395 183134 DEBUG nova.network.neutron [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.416 183134 INFO nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.443 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.557 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.558 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.559 183134 INFO nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Creating image(s)#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.560 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.560 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.561 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.577 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.624 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.624 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.625 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.634 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.677 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.678 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.706 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.707 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.708 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.762 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.763 183134 DEBUG nova.virt.disk.api [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.763 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.813 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.815 183134 DEBUG nova.virt.disk.api [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.815 183134 DEBUG nova.objects.instance [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid aed146e3-865d-4aee-a055-42ed41e035c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.832 183134 DEBUG nova.policy [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:32:55 np0005601977 podman[219106]: 2026-01-30 09:32:55.833816461 +0000 UTC m=+0.048573585 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.836 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.836 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Ensure instance console log exists: /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.837 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.837 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:55 np0005601977 podman[219108]: 2026-01-30 09:32:55.838910071 +0000 UTC m=+0.051914283 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:32:55 np0005601977 nova_compute[183130]: 2026-01-30 09:32:55.838 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.044 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.044 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.067 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.231 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.232 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.239 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.240 183134 INFO nova.compute.claims [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.537 183134 DEBUG nova.compute.provider_tree [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.555 183134 DEBUG nova.scheduler.client.report [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.593 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.594 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.665 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.666 183134 DEBUG nova.network.neutron [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.701 183134 INFO nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.738 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.859 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.861 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.862 183134 INFO nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Creating image(s)#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.862 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "/var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.862 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "/var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.863 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "/var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.876 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.925 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.926 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.927 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.950 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.992 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:56 np0005601977 nova_compute[183130]: 2026-01-30 09:32:56.994 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.023 183134 DEBUG nova.policy [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.027 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.029 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.029 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.084 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.085 183134 DEBUG nova.virt.disk.api [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Checking if we can resize image /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.085 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.140 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.141 183134 DEBUG nova.virt.disk.api [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Cannot resize image /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.141 183134 DEBUG nova.objects.instance [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lazy-loading 'migration_context' on Instance uuid 8aafaddd-1368-427e-8596-2b5871053f79 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.175 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.176 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Ensure instance console log exists: /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.176 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.177 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.177 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.369 183134 DEBUG nova.compute.manager [req-1440494a-73ab-4ed8-a198-eeb6b96dbc8e req-1a3272dc-a84f-425a-a13e-c2de7788517d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.370 183134 DEBUG oslo_concurrency.lockutils [req-1440494a-73ab-4ed8-a198-eeb6b96dbc8e req-1a3272dc-a84f-425a-a13e-c2de7788517d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.370 183134 DEBUG oslo_concurrency.lockutils [req-1440494a-73ab-4ed8-a198-eeb6b96dbc8e req-1a3272dc-a84f-425a-a13e-c2de7788517d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.371 183134 DEBUG oslo_concurrency.lockutils [req-1440494a-73ab-4ed8-a198-eeb6b96dbc8e req-1a3272dc-a84f-425a-a13e-c2de7788517d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.371 183134 DEBUG nova.compute.manager [req-1440494a-73ab-4ed8-a198-eeb6b96dbc8e req-1a3272dc-a84f-425a-a13e-c2de7788517d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.371 183134 WARNING nova.compute.manager [req-1440494a-73ab-4ed8-a198-eeb6b96dbc8e req-1a3272dc-a84f-425a-a13e-c2de7788517d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:32:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:57.385 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:57.386 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:32:57.386 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.387 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.423 183134 DEBUG nova.network.neutron [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Successfully created port: 8472693d-cc3d-4223-b981-b7d1e9f96531 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.732 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.733 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.760 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.856 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.857 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.864 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.864 183134 INFO nova.compute.claims [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.967 183134 DEBUG nova.scheduler.client.report [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Refreshing inventories for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.996 183134 DEBUG nova.scheduler.client.report [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Updating ProviderTree inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:32:57 np0005601977 nova_compute[183130]: 2026-01-30 09:32:57.996 183134 DEBUG nova.compute.provider_tree [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.014 183134 DEBUG nova.scheduler.client.report [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Refreshing aggregate associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.044 183134 DEBUG nova.scheduler.client.report [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Refreshing trait associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, traits: HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.196 183134 DEBUG nova.compute.provider_tree [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.219 183134 DEBUG nova.scheduler.client.report [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.244 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.245 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.273 183134 DEBUG nova.network.neutron [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Successfully created port: c0d4f325-5a98-4a02-aa86-34097b369c03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.296 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.296 183134 DEBUG nova.network.neutron [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.327 183134 INFO nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.354 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.483 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.485 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.485 183134 INFO nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Creating image(s)#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.486 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.487 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.488 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.512 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.587 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.589 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.590 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.616 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.696 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.698 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.724 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.726 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.727 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.783 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.784 183134 DEBUG nova.virt.disk.api [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.785 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.829 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.830 183134 DEBUG nova.virt.disk.api [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.830 183134 DEBUG nova.objects.instance [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 65c66677-23b6-479a-863f-3dd277183a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.846 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.846 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Ensure instance console log exists: /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.847 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.847 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:58 np0005601977 nova_compute[183130]: 2026-01-30 09:32:58.847 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:59 np0005601977 nova_compute[183130]: 2026-01-30 09:32:59.020 183134 DEBUG nova.policy [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.368 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.368 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.376 183134 DEBUG nova.network.neutron [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Successfully updated port: 8472693d-cc3d-4223-b981-b7d1e9f96531 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.399 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.400 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.400 183134 DEBUG nova.network.neutron [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.470 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.492 183134 DEBUG nova.compute.manager [req-31feabd2-caa0-407c-bbb0-13fdb3a13845 req-72efb0b5-e985-43f5-997b-a6127121278f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.493 183134 DEBUG oslo_concurrency.lockutils [req-31feabd2-caa0-407c-bbb0-13fdb3a13845 req-72efb0b5-e985-43f5-997b-a6127121278f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.493 183134 DEBUG oslo_concurrency.lockutils [req-31feabd2-caa0-407c-bbb0-13fdb3a13845 req-72efb0b5-e985-43f5-997b-a6127121278f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.494 183134 DEBUG oslo_concurrency.lockutils [req-31feabd2-caa0-407c-bbb0-13fdb3a13845 req-72efb0b5-e985-43f5-997b-a6127121278f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.494 183134 DEBUG nova.compute.manager [req-31feabd2-caa0-407c-bbb0-13fdb3a13845 req-72efb0b5-e985-43f5-997b-a6127121278f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.494 183134 WARNING nova.compute.manager [req-31feabd2-caa0-407c-bbb0-13fdb3a13845 req-72efb0b5-e985-43f5-997b-a6127121278f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.521 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.522 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:00 np0005601977 podman[219181]: 2026-01-30 09:33:00.566280929 +0000 UTC m=+0.145631731 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.585 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.634 183134 DEBUG nova.network.neutron [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.735 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.736 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5446MB free_disk=73.32210540771484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.736 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.737 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.815 183134 DEBUG nova.network.neutron [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Successfully created port: a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.831 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.831 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance aed146e3-865d-4aee-a055-42ed41e035c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.832 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 8aafaddd-1368-427e-8596-2b5871053f79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.832 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 65c66677-23b6-479a-863f-3dd277183a7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.832 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.832 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.887 183134 DEBUG nova.network.neutron [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Successfully updated port: c0d4f325-5a98-4a02-aa86-34097b369c03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.903 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.904 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquired lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.904 183134 DEBUG nova.network.neutron [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.965 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:33:00 np0005601977 nova_compute[183130]: 2026-01-30 09:33:00.988 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:33:01 np0005601977 nova_compute[183130]: 2026-01-30 09:33:01.018 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:33:01 np0005601977 nova_compute[183130]: 2026-01-30 09:33:01.018 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:01 np0005601977 nova_compute[183130]: 2026-01-30 09:33:01.288 183134 DEBUG nova.network.neutron [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.097 183134 DEBUG nova.network.neutron [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updating instance_info_cache with network_info: [{"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.123 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.124 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Instance network_info: |[{"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.127 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Start _get_guest_xml network_info=[{"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.132 183134 DEBUG nova.compute.manager [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-changed-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.132 183134 DEBUG nova.compute.manager [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Refreshing instance network info cache due to event network-changed-c0d4f325-5a98-4a02-aa86-34097b369c03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.133 183134 DEBUG oslo_concurrency.lockutils [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.134 183134 WARNING nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.142 183134 DEBUG nova.virt.libvirt.host [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.143 183134 DEBUG nova.virt.libvirt.host [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.147 183134 DEBUG nova.virt.libvirt.host [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.148 183134 DEBUG nova.virt.libvirt.host [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.149 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.149 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.150 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.150 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.150 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.151 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.151 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.151 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.152 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.152 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.152 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.153 183134 DEBUG nova.virt.hardware [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.158 183134 DEBUG nova.virt.libvirt.vif [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=28,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHWc7+iju0ig/cYYtgMF7FccVXU/cVOvBYUArFUjtOyOMzdPSWipY4qhftKwG2kAT2FaeRfqftE1sruqmFqkCPVFpP923bzNJR9Cde3eohExOkgLh5N+aVAVzBeqt1QUXA==',key_name='tempest-TestSecurityGroupsBasicOps-1925360427',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-k77iz630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:55Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=aed146e3-865d-4aee-a055-42ed41e035c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.158 183134 DEBUG nova.network.os_vif_util [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.159 183134 DEBUG nova.network.os_vif_util [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.160 183134 DEBUG nova.objects.instance [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid aed146e3-865d-4aee-a055-42ed41e035c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.183 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <uuid>aed146e3-865d-4aee-a055-42ed41e035c5</uuid>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <name>instance-0000001c</name>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611</nova:name>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:02</nova:creationTime>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        <nova:port uuid="8472693d-cc3d-4223-b981-b7d1e9f96531">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <entry name="serial">aed146e3-865d-4aee-a055-42ed41e035c5</entry>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <entry name="uuid">aed146e3-865d-4aee-a055-42ed41e035c5</entry>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.config"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:87:b2:b6"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <target dev="tap8472693d-cc"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/console.log" append="off"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:02 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:02 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:02 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:02 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.184 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Preparing to wait for external event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.184 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.184 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.184 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.185 183134 DEBUG nova.virt.libvirt.vif [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=28,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHWc7+iju0ig/cYYtgMF7FccVXU/cVOvBYUArFUjtOyOMzdPSWipY4qhftKwG2kAT2FaeRfqftE1sruqmFqkCPVFpP923bzNJR9Cde3eohExOkgLh5N+aVAVzBeqt1QUXA==',key_name='tempest-TestSecurityGroupsBasicOps-1925360427',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-k77iz630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:55Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=aed146e3-865d-4aee-a055-42ed41e035c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.186 183134 DEBUG nova.network.os_vif_util [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.186 183134 DEBUG nova.network.os_vif_util [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.187 183134 DEBUG os_vif [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.187 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.188 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.188 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.192 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.193 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8472693d-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.193 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8472693d-cc, col_values=(('external_ids', {'iface-id': '8472693d-cc3d-4223-b981-b7d1e9f96531', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:b2:b6', 'vm-uuid': 'aed146e3-865d-4aee-a055-42ed41e035c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.195 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:02 np0005601977 NetworkManager[55565]: <info>  [1769765582.1965] manager: (tap8472693d-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.197 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.199 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.200 183134 INFO os_vif [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc')#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.253 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.255 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.255 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:87:b2:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.256 183134 INFO nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Using config drive#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.362 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765567.3619654, 9a05f548-167d-4fc7-b5ec-87e02ee03818 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.363 183134 INFO nova.compute.manager [-] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.389 183134 DEBUG nova.compute.manager [None req-068e45a5-3829-4d97-a0f3-cd3d5f7a5f4a - - - - - -] [instance: 9a05f548-167d-4fc7-b5ec-87e02ee03818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.924 183134 INFO nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Creating config drive at /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.config#033[00m
Jan 30 04:33:02 np0005601977 nova_compute[183130]: 2026-01-30 09:33:02.929 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_rqnjo2r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.021 183134 DEBUG nova.network.neutron [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Successfully updated port: a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.034 183134 DEBUG nova.network.neutron [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updating instance_info_cache with network_info: [{"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.038 183134 DEBUG nova.compute.manager [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-changed-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.038 183134 DEBUG nova.compute.manager [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Refreshing instance network info cache due to event network-changed-8472693d-cc3d-4223-b981-b7d1e9f96531. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.039 183134 DEBUG oslo_concurrency.lockutils [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.039 183134 DEBUG oslo_concurrency.lockutils [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.039 183134 DEBUG nova.network.neutron [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Refreshing network info cache for port 8472693d-cc3d-4223-b981-b7d1e9f96531 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.045 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-65c66677-23b6-479a-863f-3dd277183a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.045 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-65c66677-23b6-479a-863f-3dd277183a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.045 183134 DEBUG nova.network.neutron [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.049 183134 DEBUG oslo_concurrency.processutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_rqnjo2r" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.068 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Releasing lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.068 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Instance network_info: |[{"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.069 183134 DEBUG oslo_concurrency.lockutils [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.069 183134 DEBUG nova.network.neutron [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Refreshing network info cache for port c0d4f325-5a98-4a02-aa86-34097b369c03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.074 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Start _get_guest_xml network_info=[{"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.085 183134 WARNING nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.0900] manager: (tap8472693d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 30 04:33:03 np0005601977 kernel: tap8472693d-cc: entered promiscuous mode
Jan 30 04:33:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:03Z|00270|binding|INFO|Claiming lport 8472693d-cc3d-4223-b981-b7d1e9f96531 for this chassis.
Jan 30 04:33:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:03Z|00271|binding|INFO|8472693d-cc3d-4223-b981-b7d1e9f96531: Claiming fa:16:3e:87:b2:b6 10.100.0.4
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.092 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.094 183134 DEBUG nova.virt.libvirt.host [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.095 183134 DEBUG nova.virt.libvirt.host [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:03Z|00272|binding|INFO|Setting lport 8472693d-cc3d-4223-b981-b7d1e9f96531 ovn-installed in OVS
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.103 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:03Z|00273|binding|INFO|Setting lport 8472693d-cc3d-4223-b981-b7d1e9f96531 up in Southbound
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.105 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:b2:b6 10.100.0.4'], port_security=['fa:16:3e:87:b2:b6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-632dc37f-a471-48f7-998e-601c234d5eea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b89a168-ee83-4ac9-852d-dbd31b3e41f9 4c85f148-14e9-414e-82f7-3cd927a329dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa3c73d5-3fb7-4892-bbfe-678dc6ae4603, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=8472693d-cc3d-4223-b981-b7d1e9f96531) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.107 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 8472693d-cc3d-4223-b981-b7d1e9f96531 in datapath 632dc37f-a471-48f7-998e-601c234d5eea bound to our chassis#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.108 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 632dc37f-a471-48f7-998e-601c234d5eea#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.110 183134 DEBUG nova.virt.libvirt.host [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.111 183134 DEBUG nova.virt.libvirt.host [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.113 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.113 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.114 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.114 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.114 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.114 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.115 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.115 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.115 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.115 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.116 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.116 183134 DEBUG nova.virt.hardware [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.116 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[557e4c34-8a41-4e59-b79e-d9d3d602d453]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.117 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap632dc37f-a1 in ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:33:03 np0005601977 systemd-udevd[219231]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.119 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap632dc37f-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.120 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[657ff78b-cdfd-4e5c-96e5-67c762e4aa10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.120 183134 DEBUG nova.virt.libvirt.vif [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1086942880',display_name='tempest-TestSnapshotPattern-server-1086942880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1086942880',id=29,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBID/K14fCFZHI5JWVWJUAGxCxMra4dWigp369J3sRUJzCc186c+CfXnLX8j6/t+x/1d86id47fcfCswYvS7jgUYMt+7CnhVorESuhnLGDpdEvTT2EjLPSXUofPGYaVdusg==',key_name='tempest-TestSnapshotPattern-1959333507',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8960c51c5e7f4c65928b539d6bd01b08',ramdisk_id='',reservation_id='r-izyluhvh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1319331586',owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:56Z,user_data=None,user_id='7701defc672143599a29756b7b25b4dc',uuid=8aafaddd-1368-427e-8596-2b5871053f79,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.121 183134 DEBUG nova.network.os_vif_util [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converting VIF {"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.121 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3e7c45-cf80-4310-a8b7-d7d962b6b6a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 systemd-machined[154431]: New machine qemu-21-instance-0000001c.
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.122 183134 DEBUG nova.network.os_vif_util [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.122 183134 DEBUG nova.objects.instance [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8aafaddd-1368-427e-8596-2b5871053f79 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.1270] device (tap8472693d-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.1281] device (tap8472693d-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.129 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[f8e84e38-0fcf-4477-8bd7-38f8aa14b579]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 systemd[1]: Started Virtual Machine qemu-21-instance-0000001c.
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.139 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e0503990-6cd2-4fc9-923c-99a0ac008d92]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.140 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <uuid>8aafaddd-1368-427e-8596-2b5871053f79</uuid>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <name>instance-0000001d</name>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestSnapshotPattern-server-1086942880</nova:name>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:03</nova:creationTime>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:user uuid="7701defc672143599a29756b7b25b4dc">tempest-TestSnapshotPattern-1319331586-project-member</nova:user>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:project uuid="8960c51c5e7f4c65928b539d6bd01b08">tempest-TestSnapshotPattern-1319331586</nova:project>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        <nova:port uuid="c0d4f325-5a98-4a02-aa86-34097b369c03">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <entry name="serial">8aafaddd-1368-427e-8596-2b5871053f79</entry>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <entry name="uuid">8aafaddd-1368-427e-8596-2b5871053f79</entry>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.config"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:8b:bd:f9"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <target dev="tapc0d4f325-5a"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/console.log" append="off"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:03 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:03 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:03 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:03 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.140 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Preparing to wait for external event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.141 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.141 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.142 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.143 183134 DEBUG nova.virt.libvirt.vif [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1086942880',display_name='tempest-TestSnapshotPattern-server-1086942880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1086942880',id=29,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBID/K14fCFZHI5JWVWJUAGxCxMra4dWigp369J3sRUJzCc186c+CfXnLX8j6/t+x/1d86id47fcfCswYvS7jgUYMt+7CnhVorESuhnLGDpdEvTT2EjLPSXUofPGYaVdusg==',key_name='tempest-TestSnapshotPattern-1959333507',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8960c51c5e7f4c65928b539d6bd01b08',ramdisk_id='',reservation_id='r-izyluhvh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1319331586',owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:56Z,user_data=None,user_id='7701defc672143599a29756b7b25b4dc',uuid=8aafaddd-1368-427e-8596-2b5871053f79,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.143 183134 DEBUG nova.network.os_vif_util [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converting VIF {"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.144 183134 DEBUG nova.network.os_vif_util [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.144 183134 DEBUG os_vif [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.145 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.145 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.146 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.148 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.148 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0d4f325-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.149 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0d4f325-5a, col_values=(('external_ids', {'iface-id': 'c0d4f325-5a98-4a02-aa86-34097b369c03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:bd:f9', 'vm-uuid': '8aafaddd-1368-427e-8596-2b5871053f79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.1511] manager: (tapc0d4f325-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.150 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.154 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.156 183134 INFO os_vif [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a')#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.161 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[27f5f6e4-e1dd-4488-9af7-acaa4984fd02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.1669] manager: (tap632dc37f-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.166 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[17235876-fe93-4791-b540-7d08b6109516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 systemd-udevd[219234]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.191 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e06d8e68-035c-4a63-a509-7a7a69de4be3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.193 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[adf89e61-982e-40db-a74c-b8f31a372fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.2103] device (tap632dc37f-a0): carrier: link connected
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.214 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[79a509ef-7373-4045-b336-41f495b6bfbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.227 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac7a157-fd16-402f-b4f5-d017dd286aa0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap632dc37f-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:e5:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411455, 'reachable_time': 25559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219267, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.235 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.236 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.236 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] No VIF found with MAC fa:16:3e:8b:bd:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.236 183134 INFO nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Using config drive#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.238 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d775447d-a461-4816-a0a5-c3c9904d43cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:e592'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411455, 'tstamp': 411455}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219268, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.254 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[74ac7912-d8ac-4e28-8d85-a6708dd32fd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap632dc37f-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:e5:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411455, 'reachable_time': 25559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219269, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.281 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[828d9232-a20d-46f1-9897-fedcf3f22519]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.335 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6d072d23-9f8a-413a-a113-91913f66fd01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.337 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap632dc37f-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.337 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.338 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap632dc37f-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:03 np0005601977 NetworkManager[55565]: <info>  [1769765583.3405] manager: (tap632dc37f-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.340 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 kernel: tap632dc37f-a0: entered promiscuous mode
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.344 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.346 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap632dc37f-a0, col_values=(('external_ids', {'iface-id': '13570b6a-d879-43dc-b830-8118569a82b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.347 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:03Z|00274|binding|INFO|Releasing lport 13570b6a-d879-43dc-b830-8118569a82b6 from this chassis (sb_readonly=0)
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.355 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.358 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.359 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/632dc37f-a471-48f7-998e-601c234d5eea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/632dc37f-a471-48f7-998e-601c234d5eea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.360 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c0c398-444b-4357-af24-ccbdbd973b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.361 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-632dc37f-a471-48f7-998e-601c234d5eea
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/632dc37f-a471-48f7-998e-601c234d5eea.pid.haproxy
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 632dc37f-a471-48f7-998e-601c234d5eea
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:33:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:03.362 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'env', 'PROCESS_TAG=haproxy-632dc37f-a471-48f7-998e-601c234d5eea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/632dc37f-a471-48f7-998e-601c234d5eea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.467 183134 DEBUG nova.network.neutron [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.627 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765583.6261263, aed146e3-865d-4aee-a055-42ed41e035c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.627 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] VM Started (Lifecycle Event)#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.650 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.655 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765583.627632, aed146e3-865d-4aee-a055-42ed41e035c5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.655 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.679 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.682 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:03 np0005601977 podman[219312]: 2026-01-30 09:33:03.691923237 +0000 UTC m=+0.048381679 container create e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:33:03 np0005601977 nova_compute[183130]: 2026-01-30 09:33:03.709 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:03 np0005601977 systemd[1]: Started libpod-conmon-e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8.scope.
Jan 30 04:33:03 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:33:03 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0904f9e7d534fe9dc1ee87ea4033b77a2a99ec5a5d694dcc8d83411b849a25b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:33:03 np0005601977 podman[219312]: 2026-01-30 09:33:03.669761127 +0000 UTC m=+0.026219609 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:33:03 np0005601977 podman[219312]: 2026-01-30 09:33:03.768626996 +0000 UTC m=+0.125085478 container init e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 30 04:33:03 np0005601977 podman[219312]: 2026-01-30 09:33:03.776071254 +0000 UTC m=+0.132529706 container start e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:33:03 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [NOTICE]   (219331) : New worker (219333) forked
Jan 30 04:33:03 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [NOTICE]   (219331) : Loading success.
Jan 30 04:33:04 np0005601977 nova_compute[183130]: 2026-01-30 09:33:04.715 183134 INFO nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Creating config drive at /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.config#033[00m
Jan 30 04:33:04 np0005601977 nova_compute[183130]: 2026-01-30 09:33:04.719 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp73_2_lvh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:04 np0005601977 nova_compute[183130]: 2026-01-30 09:33:04.842 183134 DEBUG oslo_concurrency.processutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp73_2_lvh" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:04 np0005601977 kernel: tapc0d4f325-5a: entered promiscuous mode
Jan 30 04:33:04 np0005601977 NetworkManager[55565]: <info>  [1769765584.9028] manager: (tapc0d4f325-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Jan 30 04:33:04 np0005601977 systemd-udevd[219256]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:04 np0005601977 NetworkManager[55565]: <info>  [1769765584.9185] device (tapc0d4f325-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:04 np0005601977 NetworkManager[55565]: <info>  [1769765584.9192] device (tapc0d4f325-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:04Z|00275|binding|INFO|Claiming lport c0d4f325-5a98-4a02-aa86-34097b369c03 for this chassis.
Jan 30 04:33:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:04Z|00276|binding|INFO|c0d4f325-5a98-4a02-aa86-34097b369c03: Claiming fa:16:3e:8b:bd:f9 10.100.0.14
Jan 30 04:33:04 np0005601977 nova_compute[183130]: 2026-01-30 09:33:04.943 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:04Z|00277|binding|INFO|Setting lport c0d4f325-5a98-4a02-aa86-34097b369c03 ovn-installed in OVS
Jan 30 04:33:04 np0005601977 nova_compute[183130]: 2026-01-30 09:33:04.947 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:04Z|00278|binding|INFO|Setting lport c0d4f325-5a98-4a02-aa86-34097b369c03 up in Southbound
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.956 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:bd:f9 10.100.0.14'], port_security=['fa:16:3e:8b:bd:f9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8980838c-37f7-45e5-9084-1321907354d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cc00530c-da00-4b1f-8544-f4f16829e051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4c33c50-4f9e-4c9d-ac8f-b1ee6c0d33bf, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=c0d4f325-5a98-4a02-aa86-34097b369c03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.958 104706 INFO neutron.agent.ovn.metadata.agent [-] Port c0d4f325-5a98-4a02-aa86-34097b369c03 in datapath 8980838c-37f7-45e5-9084-1321907354d2 bound to our chassis#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.960 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8980838c-37f7-45e5-9084-1321907354d2#033[00m
Jan 30 04:33:04 np0005601977 systemd-machined[154431]: New machine qemu-22-instance-0000001d.
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.970 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[34b5ecad-4e5b-402b-9bc8-a8d5421feecb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.971 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8980838c-31 in ovnmeta-8980838c-37f7-45e5-9084-1321907354d2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.974 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8980838c-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.974 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c386e38c-14ae-4c43-a889-0d00bc3a32de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.975 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[38f8ef70-a020-4665-b8e6-855a7a647ec0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:04.987 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[8c30fca1-56fc-409d-bb4a-0f84d31078b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:04 np0005601977 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.009 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3d610729-9756-4f04-9e39-176f71471e25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.037 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[510f6222-4c2e-45a6-9aab-0e0d70eb6a27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.044 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9c49419e-4bf4-413b-a3b5-49cbeb4eaec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 NetworkManager[55565]: <info>  [1769765585.0459] manager: (tap8980838c-30): new Veth device (/org/freedesktop/NetworkManager/Devices/116)
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.070 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0ebd69-590e-4737-98a3-eda33da10e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.074 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f528e0-a943-4ba6-8514-753644a47eb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 NetworkManager[55565]: <info>  [1769765585.0885] device (tap8980838c-30): carrier: link connected
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.090 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8b4ffe-a1a0-4979-89ba-28137d23ff44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.100 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bcee14a6-7bca-4d2c-b22c-079c1227bbd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8980838c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:7d:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411642, 'reachable_time': 33905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219378, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.112 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[af79f4e9-9852-4600-810c-1bd7d5c4fea8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9d:7d9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411642, 'tstamp': 411642}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219379, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.121 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[76a32331-1514-4d58-8323-3fe72fcdb7dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8980838c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:7d:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411642, 'reachable_time': 33905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219380, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.140 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1efee9-030b-4d38-874c-462637cc7f29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.171 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[123fbcad-879e-4b9a-8667-ae3d43a4f0ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.172 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8980838c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.172 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.172 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8980838c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:05 np0005601977 NetworkManager[55565]: <info>  [1769765585.1749] manager: (tap8980838c-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.174 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:05 np0005601977 kernel: tap8980838c-30: entered promiscuous mode
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.177 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.178 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8980838c-30, col_values=(('external_ids', {'iface-id': '50e26df2-7d93-4204-9b22-94b2140c0f47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.179 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:05 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:05Z|00279|binding|INFO|Releasing lport 50e26df2-7d93-4204-9b22-94b2140c0f47 from this chassis (sb_readonly=0)
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.190 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.192 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8980838c-37f7-45e5-9084-1321907354d2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8980838c-37f7-45e5-9084-1321907354d2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.193 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccfdfeb-73bb-4be2-ab92-675a83c91413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.194 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-8980838c-37f7-45e5-9084-1321907354d2
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/8980838c-37f7-45e5-9084-1321907354d2.pid.haproxy
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 8980838c-37f7-45e5-9084-1321907354d2
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:33:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:05.194 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'env', 'PROCESS_TAG=haproxy-8980838c-37f7-45e5-9084-1321907354d2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8980838c-37f7-45e5-9084-1321907354d2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:33:05 np0005601977 podman[219412]: 2026-01-30 09:33:05.508306191 +0000 UTC m=+0.033944356 container create 9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:33:05 np0005601977 systemd[1]: Started libpod-conmon-9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139.scope.
Jan 30 04:33:05 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:33:05 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c7d1b2d0e7c17734df5faab5bff318464acf24aec429435e68e8127b660fe37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:33:05 np0005601977 podman[219412]: 2026-01-30 09:33:05.564735525 +0000 UTC m=+0.090373740 container init 9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:33:05 np0005601977 podman[219412]: 2026-01-30 09:33:05.569771873 +0000 UTC m=+0.095410058 container start 9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:33:05 np0005601977 podman[219412]: 2026-01-30 09:33:05.490296943 +0000 UTC m=+0.015935128 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:33:05 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [NOTICE]   (219431) : New worker (219433) forked
Jan 30 04:33:05 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [NOTICE]   (219431) : Loading success.
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.627 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.744 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765585.743903, 8aafaddd-1368-427e-8596-2b5871053f79 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.745 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] VM Started (Lifecycle Event)#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.848 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.854 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765585.7444549, 8aafaddd-1368-427e-8596-2b5871053f79 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.855 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.879 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.882 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:05 np0005601977 nova_compute[183130]: 2026-01-30 09:33:05.905 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:06 np0005601977 nova_compute[183130]: 2026-01-30 09:33:06.019 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:06 np0005601977 nova_compute[183130]: 2026-01-30 09:33:06.927 183134 DEBUG nova.compute.manager [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-changed-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:06 np0005601977 nova_compute[183130]: 2026-01-30 09:33:06.928 183134 DEBUG nova.compute.manager [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Refreshing instance network info cache due to event network-changed-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:06 np0005601977 nova_compute[183130]: 2026-01-30 09:33:06.928 183134 DEBUG oslo_concurrency.lockutils [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-65c66677-23b6-479a-863f-3dd277183a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.657 183134 DEBUG nova.network.neutron [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Updating instance_info_cache with network_info: [{"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.721 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-65c66677-23b6-479a-863f-3dd277183a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.722 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Instance network_info: |[{"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.723 183134 DEBUG oslo_concurrency.lockutils [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-65c66677-23b6-479a-863f-3dd277183a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.723 183134 DEBUG nova.network.neutron [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Refreshing network info cache for port a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.728 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Start _get_guest_xml network_info=[{"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.735 183134 WARNING nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.744 183134 DEBUG nova.virt.libvirt.host [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.744 183134 DEBUG nova.virt.libvirt.host [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.755 183134 DEBUG nova.virt.libvirt.host [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.756 183134 DEBUG nova.virt.libvirt.host [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.758 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.759 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.760 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.760 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.761 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.761 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.762 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.762 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.763 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.763 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.764 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.764 183134 DEBUG nova.virt.hardware [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.770 183134 DEBUG nova.virt.libvirt.vif [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-667740087',display_name='tempest-TestNetworkBasicOps-server-667740087',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-667740087',id=30,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKtearXovlRuDr42R5KGcjTwDUuQcl31zxpUy8NrHTEFSD+z4AYJc/VvwzKFMhyXdM5aphE+owZVs1dFxITTom8VCU4qittDL9ERuX8/wSOcgKppHIOemqZyBEgxc7eS3Q==',key_name='tempest-TestNetworkBasicOps-1937833633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-yw2tqwjc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:58Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65c66677-23b6-479a-863f-3dd277183a7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.771 183134 DEBUG nova.network.os_vif_util [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.772 183134 DEBUG nova.network.os_vif_util [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.774 183134 DEBUG nova.objects.instance [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid 65c66677-23b6-479a-863f-3dd277183a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.806 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <uuid>65c66677-23b6-479a-863f-3dd277183a7d</uuid>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <name>instance-0000001e</name>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-667740087</nova:name>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:07</nova:creationTime>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        <nova:port uuid="a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <entry name="serial">65c66677-23b6-479a-863f-3dd277183a7d</entry>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <entry name="uuid">65c66677-23b6-479a-863f-3dd277183a7d</entry>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.config"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:c9:d8:0d"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <target dev="tapa5afd5ba-13"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/console.log" append="off"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:07 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:07 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:07 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:07 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.807 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Preparing to wait for external event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.808 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.808 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.808 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.809 183134 DEBUG nova.virt.libvirt.vif [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:32:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-667740087',display_name='tempest-TestNetworkBasicOps-server-667740087',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-667740087',id=30,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKtearXovlRuDr42R5KGcjTwDUuQcl31zxpUy8NrHTEFSD+z4AYJc/VvwzKFMhyXdM5aphE+owZVs1dFxITTom8VCU4qittDL9ERuX8/wSOcgKppHIOemqZyBEgxc7eS3Q==',key_name='tempest-TestNetworkBasicOps-1937833633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-yw2tqwjc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:32:58Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65c66677-23b6-479a-863f-3dd277183a7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.810 183134 DEBUG nova.network.os_vif_util [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.810 183134 DEBUG nova.network.os_vif_util [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.811 183134 DEBUG os_vif [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.811 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.812 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.812 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.815 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.816 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5afd5ba-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.816 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5afd5ba-13, col_values=(('external_ids', {'iface-id': 'a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:d8:0d', 'vm-uuid': '65c66677-23b6-479a-863f-3dd277183a7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.818 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:07 np0005601977 NetworkManager[55565]: <info>  [1769765587.8194] manager: (tapa5afd5ba-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.821 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.822 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.823 183134 INFO os_vif [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13')#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.859 183134 DEBUG nova.network.neutron [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updated VIF entry in instance network info cache for port c0d4f325-5a98-4a02-aa86-34097b369c03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.860 183134 DEBUG nova.network.neutron [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updating instance_info_cache with network_info: [{"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.878 183134 DEBUG oslo_concurrency.lockutils [req-9d8f7d80-661c-45b4-a0f4-d1521e2a2a92 req-49c59516-981a-432a-be8e-9829e447d712 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.882 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.882 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.883 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:c9:d8:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:07 np0005601977 nova_compute[183130]: 2026-01-30 09:33:07.883 183134 INFO nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Using config drive#033[00m
Jan 30 04:33:08 np0005601977 nova_compute[183130]: 2026-01-30 09:33:08.756 183134 DEBUG nova.network.neutron [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updated VIF entry in instance network info cache for port 8472693d-cc3d-4223-b981-b7d1e9f96531. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:08 np0005601977 nova_compute[183130]: 2026-01-30 09:33:08.757 183134 DEBUG nova.network.neutron [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updating instance_info_cache with network_info: [{"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:08 np0005601977 nova_compute[183130]: 2026-01-30 09:33:08.812 183134 DEBUG oslo_concurrency.lockutils [req-ca1b0620-db48-4fb5-ba55-15d7ea74a17c req-e13eb16a-46ca-41ed-a104-652d9bb2c180 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:08 np0005601977 podman[219452]: 2026-01-30 09:33:08.86833263 +0000 UTC m=+0.072282150 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:33:08 np0005601977 nova_compute[183130]: 2026-01-30 09:33:08.994 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.007 183134 INFO nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Creating config drive at /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.config#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.012 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3tgo0pq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.135 183134 DEBUG oslo_concurrency.processutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3tgo0pq" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:09 np0005601977 kernel: tapa5afd5ba-13: entered promiscuous mode
Jan 30 04:33:09 np0005601977 NetworkManager[55565]: <info>  [1769765589.1964] manager: (tapa5afd5ba-13): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Jan 30 04:33:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:09Z|00280|binding|INFO|Claiming lport a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e for this chassis.
Jan 30 04:33:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:09Z|00281|binding|INFO|a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e: Claiming fa:16:3e:c9:d8:0d 10.100.0.18
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.232 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:09 np0005601977 systemd-udevd[219492]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:09Z|00282|binding|INFO|Setting lport a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e ovn-installed in OVS
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.240 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:09Z|00283|binding|INFO|Setting lport a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e up in Southbound
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.243 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:d8:0d 10.100.0.18'], port_security=['fa:16:3e:c9:d8:0d 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed22b354-0eec-4dad-b9f9-3e87260fdb37', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbe9ac69-dab6-405f-be15-dcf6f6e9930e, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.244 104706 INFO neutron.agent.ovn.metadata.agent [-] Port a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e in datapath 6c079c23-8031-4776-b9b7-153f2dd27fc7 bound to our chassis#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.245 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c079c23-8031-4776-b9b7-153f2dd27fc7#033[00m
Jan 30 04:33:09 np0005601977 NetworkManager[55565]: <info>  [1769765589.2460] device (tapa5afd5ba-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:09 np0005601977 NetworkManager[55565]: <info>  [1769765589.2470] device (tapa5afd5ba-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:09 np0005601977 systemd-machined[154431]: New machine qemu-23-instance-0000001e.
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.257 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1893d981-fe53-47ad-9a8f-8a803efeefdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:09 np0005601977 systemd[1]: Started Virtual Machine qemu-23-instance-0000001e.
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.277 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f98c98bf-e58a-4e43-97e6-ec93a0d67041]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.280 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0e4f230c-11a2-4511-ac7e-0a2abf792a1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.294 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[1b065b35-cc56-4114-9637-27c05314d4c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.306 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9b77de56-9b5d-4a6a-ae70-42ea69ce3a00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c079c23-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:96:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409728, 'reachable_time': 26163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219508, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.318 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f062fe-f968-4055-bf7a-6f80835ef7ca]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap6c079c23-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409736, 'tstamp': 409736}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219510, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6c079c23-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409739, 'tstamp': 409739}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219510, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.319 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c079c23-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.321 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.322 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.322 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c079c23-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.323 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.323 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c079c23-80, col_values=(('external_ids', {'iface-id': 'ff915305-2000-4180-8452-99d99c6f677f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:09.323 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.383 183134 DEBUG nova.compute.manager [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.384 183134 DEBUG oslo_concurrency.lockutils [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.384 183134 DEBUG oslo_concurrency.lockutils [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.384 183134 DEBUG oslo_concurrency.lockutils [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.385 183134 DEBUG nova.compute.manager [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Processing event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.385 183134 DEBUG nova.compute.manager [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.385 183134 DEBUG oslo_concurrency.lockutils [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.385 183134 DEBUG oslo_concurrency.lockutils [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.386 183134 DEBUG oslo_concurrency.lockutils [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.386 183134 DEBUG nova.compute.manager [req-cfa38a86-5d02-4852-9097-de6986f9d2d5 req-bbe2edfc-1bb9-446e-9c0e-faf79026b9bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Processing event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.387 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.388 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.392 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765589.3918219, aed146e3-865d-4aee-a055-42ed41e035c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.392 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] VM Resumed (Lifecycle Event)
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.414 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.414 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.415 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.415 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.426 183134 INFO nova.virt.libvirt.driver [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Instance spawned successfully.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.427 183134 INFO nova.virt.libvirt.driver [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Instance spawned successfully.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.427 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.428 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.449 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.452 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.463 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.464 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.464 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.465 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.465 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.466 183134 DEBUG nova.virt.libvirt.driver [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.471 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.472 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.472 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.473 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.473 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.474 183134 DEBUG nova.virt.libvirt.driver [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.478 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.478 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765589.4019814, 8aafaddd-1368-427e-8596-2b5871053f79 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.479 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] VM Resumed (Lifecycle Event)
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.517 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.520 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.649 183134 INFO nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Took 14.09 seconds to spawn the instance on the hypervisor.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.650 183134 DEBUG nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.650 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.672 183134 INFO nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Took 12.81 seconds to spawn the instance on the hypervisor.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.673 183134 DEBUG nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.744 183134 INFO nova.compute.manager [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Took 14.64 seconds to build instance.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.752 183134 INFO nova.compute.manager [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Took 13.64 seconds to build instance.
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.770 183134 DEBUG oslo_concurrency.lockutils [None req-7d16be13-0fdf-4861-abbb-308002cc423f 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.779 183134 DEBUG oslo_concurrency.lockutils [None req-c5d770d9-6c8e-4400-92d5-5813213467b9 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.916 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765589.9161127, 65c66677-23b6-479a-863f-3dd277183a7d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.919 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] VM Started (Lifecycle Event)
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.938 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.947 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765589.9162729, 65c66677-23b6-479a-863f-3dd277183a7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.947 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] VM Paused (Lifecycle Event)
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.966 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.970 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:33:09 np0005601977 nova_compute[183130]: 2026-01-30 09:33:09.988 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.384 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.385 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.386 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.387 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.631 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.728 183134 DEBUG nova.network.neutron [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Updated VIF entry in instance network info cache for port a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.728 183134 DEBUG nova.network.neutron [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Updating instance_info_cache with network_info: [{"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.751 183134 DEBUG oslo_concurrency.lockutils [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-65c66677-23b6-479a-863f-3dd277183a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.752 183134 DEBUG nova.compute.manager [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.752 183134 DEBUG oslo_concurrency.lockutils [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.753 183134 DEBUG oslo_concurrency.lockutils [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.753 183134 DEBUG oslo_concurrency.lockutils [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.753 183134 DEBUG nova.compute.manager [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] No waiting events found dispatching network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:33:10 np0005601977 nova_compute[183130]: 2026-01-30 09:33:10.754 183134 WARNING nova.compute.manager [req-03ff85c6-272a-4096-8347-2ae0007983cc req-1a50f395-a350-453d-92cf-365ea3bd041d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received unexpected event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 for instance with vm_state building and task_state spawning.
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.156 183134 DEBUG nova.compute.manager [req-d21d47e5-de9d-4182-9d24-50688bcb5869 req-4ed58d30-411f-4de7-af82-2b7b0d62f9f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.156 183134 DEBUG oslo_concurrency.lockutils [req-d21d47e5-de9d-4182-9d24-50688bcb5869 req-4ed58d30-411f-4de7-af82-2b7b0d62f9f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.156 183134 DEBUG oslo_concurrency.lockutils [req-d21d47e5-de9d-4182-9d24-50688bcb5869 req-4ed58d30-411f-4de7-af82-2b7b0d62f9f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.156 183134 DEBUG oslo_concurrency.lockutils [req-d21d47e5-de9d-4182-9d24-50688bcb5869 req-4ed58d30-411f-4de7-af82-2b7b0d62f9f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.157 183134 DEBUG nova.compute.manager [req-d21d47e5-de9d-4182-9d24-50688bcb5869 req-4ed58d30-411f-4de7-af82-2b7b0d62f9f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] No waiting events found dispatching network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.157 183134 WARNING nova.compute.manager [req-d21d47e5-de9d-4182-9d24-50688bcb5869 req-4ed58d30-411f-4de7-af82-2b7b0d62f9f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received unexpected event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 for instance with vm_state active and task_state None.
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.758 183134 DEBUG nova.compute.manager [req-98c01fa2-1c71-4e8d-9327-8bdbdc6d98d8 req-f1fc8d2c-0426-4d3f-8b64-10ea3cd4fcef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.758 183134 DEBUG oslo_concurrency.lockutils [req-98c01fa2-1c71-4e8d-9327-8bdbdc6d98d8 req-f1fc8d2c-0426-4d3f-8b64-10ea3cd4fcef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.759 183134 DEBUG oslo_concurrency.lockutils [req-98c01fa2-1c71-4e8d-9327-8bdbdc6d98d8 req-f1fc8d2c-0426-4d3f-8b64-10ea3cd4fcef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.759 183134 DEBUG oslo_concurrency.lockutils [req-98c01fa2-1c71-4e8d-9327-8bdbdc6d98d8 req-f1fc8d2c-0426-4d3f-8b64-10ea3cd4fcef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.759 183134 DEBUG nova.compute.manager [req-98c01fa2-1c71-4e8d-9327-8bdbdc6d98d8 req-f1fc8d2c-0426-4d3f-8b64-10ea3cd4fcef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Processing event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.760 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.764 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765592.7647023, 65c66677-23b6-479a-863f-3dd277183a7d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.765 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] VM Resumed (Lifecycle Event)
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.790 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.793 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.809 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.831 183134 INFO nova.virt.libvirt.driver [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Instance spawned successfully.
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.833 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.857 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.858 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.865 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.865 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.866 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.866 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.867 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.867 183134 DEBUG nova.virt.libvirt.driver [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.939 183134 INFO nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Took 14.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:33:12 np0005601977 nova_compute[183130]: 2026-01-30 09:33:12.940 183134 DEBUG nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:13 np0005601977 nova_compute[183130]: 2026-01-30 09:33:13.017 183134 INFO nova.compute.manager [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Took 15.19 seconds to build instance.#033[00m
Jan 30 04:33:13 np0005601977 nova_compute[183130]: 2026-01-30 09:33:13.035 183134 DEBUG oslo_concurrency.lockutils [None req-f37870d8-00c0-4fca-9d11-df9dd9d41276 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:13 np0005601977 nova_compute[183130]: 2026-01-30 09:33:13.381 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:14 np0005601977 nova_compute[183130]: 2026-01-30 09:33:14.868 183134 DEBUG nova.compute.manager [req-fc4b2d1b-cd65-4f81-a7fc-3c68037318c9 req-33563e74-1b1d-4570-b638-3157cdcdf3bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:14 np0005601977 nova_compute[183130]: 2026-01-30 09:33:14.869 183134 DEBUG oslo_concurrency.lockutils [req-fc4b2d1b-cd65-4f81-a7fc-3c68037318c9 req-33563e74-1b1d-4570-b638-3157cdcdf3bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:14 np0005601977 nova_compute[183130]: 2026-01-30 09:33:14.870 183134 DEBUG oslo_concurrency.lockutils [req-fc4b2d1b-cd65-4f81-a7fc-3c68037318c9 req-33563e74-1b1d-4570-b638-3157cdcdf3bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:14 np0005601977 nova_compute[183130]: 2026-01-30 09:33:14.870 183134 DEBUG oslo_concurrency.lockutils [req-fc4b2d1b-cd65-4f81-a7fc-3c68037318c9 req-33563e74-1b1d-4570-b638-3157cdcdf3bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:14 np0005601977 nova_compute[183130]: 2026-01-30 09:33:14.870 183134 DEBUG nova.compute.manager [req-fc4b2d1b-cd65-4f81-a7fc-3c68037318c9 req-33563e74-1b1d-4570-b638-3157cdcdf3bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] No waiting events found dispatching network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:14 np0005601977 nova_compute[183130]: 2026-01-30 09:33:14.870 183134 WARNING nova.compute.manager [req-fc4b2d1b-cd65-4f81-a7fc-3c68037318c9 req-33563e74-1b1d-4570-b638-3157cdcdf3bc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received unexpected event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e for instance with vm_state active and task_state None.#033[00m
Jan 30 04:33:15 np0005601977 nova_compute[183130]: 2026-01-30 09:33:15.632 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.640 183134 DEBUG nova.compute.manager [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-changed-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.642 183134 DEBUG nova.compute.manager [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Refreshing instance network info cache due to event network-changed-8472693d-cc3d-4223-b981-b7d1e9f96531. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.642 183134 DEBUG oslo_concurrency.lockutils [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.643 183134 DEBUG oslo_concurrency.lockutils [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.643 183134 DEBUG nova.network.neutron [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Refreshing network info cache for port 8472693d-cc3d-4223-b981-b7d1e9f96531 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.798 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:17 np0005601977 podman[219525]: 2026-01-30 09:33:17.833878275 +0000 UTC m=+0.051694286 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-type=git)
Jan 30 04:33:17 np0005601977 nova_compute[183130]: 2026-01-30 09:33:17.859 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:17 np0005601977 podman[219526]: 2026-01-30 09:33:17.882075208 +0000 UTC m=+0.095792859 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.000 183134 DEBUG nova.compute.manager [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-changed-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.001 183134 DEBUG nova.compute.manager [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Refreshing instance network info cache due to event network-changed-c0d4f325-5a98-4a02-aa86-34097b369c03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.001 183134 DEBUG oslo_concurrency.lockutils [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.002 183134 DEBUG oslo_concurrency.lockutils [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.002 183134 DEBUG nova.network.neutron [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Refreshing network info cache for port c0d4f325-5a98-4a02-aa86-34097b369c03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.047 183134 DEBUG nova.network.neutron [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updated VIF entry in instance network info cache for port 8472693d-cc3d-4223-b981-b7d1e9f96531. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.047 183134 DEBUG nova.network.neutron [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updating instance_info_cache with network_info: [{"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.088 183134 DEBUG oslo_concurrency.lockutils [req-375a32f7-d6b1-42e6-b429-e5210150126a req-6ee80a0b-9c42-4e8d-8560-717a2227950e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:20Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:bd:f9 10.100.0.14
Jan 30 04:33:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:20Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:bd:f9 10.100.0.14
Jan 30 04:33:20 np0005601977 nova_compute[183130]: 2026-01-30 09:33:20.674 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:21 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:21Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:b2:b6 10.100.0.4
Jan 30 04:33:21 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:21Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:b2:b6 10.100.0.4
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.039 183134 DEBUG nova.network.neutron [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updated VIF entry in instance network info cache for port c0d4f325-5a98-4a02-aa86-34097b369c03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.040 183134 DEBUG nova.network.neutron [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updating instance_info_cache with network_info: [{"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.083 183134 DEBUG oslo_concurrency.lockutils [req-8ba8aa89-a216-4b46-8625-e897cb79d6ef req-b9a29099-9e69-4353-9838-d8e9d3d45db2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.230 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.231 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.254 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.352 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.352 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.360 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.361 183134 INFO nova.compute.claims [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.770 183134 DEBUG nova.compute.provider_tree [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.814 183134 DEBUG nova.scheduler.client.report [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:33:22 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.896 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:22.999 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.000 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.061 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.061 183134 DEBUG nova.network.neutron [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.078 183134 INFO nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.099 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.203 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.205 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.205 183134 INFO nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Creating image(s)
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.206 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.207 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.208 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.228 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.279 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.279 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.280 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.292 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.341 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.342 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.371 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.372 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.372 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.414 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.415 183134 DEBUG nova.virt.disk.api [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.416 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.472 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.473 183134 DEBUG nova.virt.disk.api [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.474 183134 DEBUG nova.objects.instance [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.536 183134 DEBUG nova.policy [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.556 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.556 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Ensure instance console log exists: /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.556 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.557 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:23 np0005601977 nova_compute[183130]: 2026-01-30 09:33:23.557 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:23Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:d8:0d 10.100.0.18
Jan 30 04:33:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:23Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:d8:0d 10.100.0.18
Jan 30 04:33:25 np0005601977 nova_compute[183130]: 2026-01-30 09:33:25.650 183134 DEBUG nova.network.neutron [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Successfully created port: 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 30 04:33:25 np0005601977 nova_compute[183130]: 2026-01-30 09:33:25.676 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:33:26 np0005601977 podman[219634]: 2026-01-30 09:33:26.863803779 +0000 UTC m=+0.076146244 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:33:26 np0005601977 podman[219635]: 2026-01-30 09:33:26.869428983 +0000 UTC m=+0.072865337 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.080 183134 DEBUG nova.compute.manager [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.132 183134 INFO nova.compute.manager [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] instance snapshotting
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.419 183134 INFO nova.virt.libvirt.driver [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Beginning live snapshot process
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.717 183134 DEBUG nova.network.neutron [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Successfully updated port: 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.762 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.762 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.762 183134 DEBUG nova.network.neutron [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.899 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.923 183134 DEBUG nova.compute.manager [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-changed-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.924 183134 DEBUG nova.compute.manager [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Refreshing instance network info cache due to event network-changed-747cab40-fbad-4008-a7ac-6cf1f12b6ee4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.924 183134 DEBUG oslo_concurrency.lockutils [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.949 183134 DEBUG nova.network.neutron [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:33:27 np0005601977 virtqemud[182587]: invalid argument: disk vda does not have an active block job
Jan 30 04:33:27 np0005601977 nova_compute[183130]: 2026-01-30 09:33:27.964 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.039 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json -f qcow2" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.040 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.114 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json -f qcow2" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.139 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.198 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.199 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpi_6p8lkm/227de697b99d4de9b4f073b6f2b4d96d.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.232 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpi_6p8lkm/227de697b99d4de9b4f073b6f2b4d96d.delta 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.233 183134 INFO nova.virt.libvirt.driver [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.284 183134 DEBUG nova.virt.libvirt.guest [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.787 183134 DEBUG nova.virt.libvirt.guest [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.791 183134 INFO nova.virt.libvirt.driver [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.831 183134 DEBUG nova.privsep.utils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 30 04:33:28 np0005601977 nova_compute[183130]: 2026-01-30 09:33:28.832 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpi_6p8lkm/227de697b99d4de9b4f073b6f2b4d96d.delta /var/lib/nova/instances/snapshots/tmpi_6p8lkm/227de697b99d4de9b4f073b6f2b4d96d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.205 183134 DEBUG oslo_concurrency.processutils [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpi_6p8lkm/227de697b99d4de9b4f073b6f2b4d96d.delta /var/lib/nova/instances/snapshots/tmpi_6p8lkm/227de697b99d4de9b4f073b6f2b4d96d" returned: 0 in 0.373s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.210 183134 INFO nova.virt.libvirt.driver [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Snapshot extracted, beginning image upload
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.412 183134 DEBUG nova.network.neutron [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updating instance_info_cache with network_info: [{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.431 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.432 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance network_info: |[{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.433 183134 DEBUG oslo_concurrency.lockutils [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.433 183134 DEBUG nova.network.neutron [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Refreshing network info cache for port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.437 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Start _get_guest_xml network_info=[{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.442 183134 WARNING nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.447 183134 DEBUG nova.virt.libvirt.host [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.447 183134 DEBUG nova.virt.libvirt.host [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.450 183134 DEBUG nova.virt.libvirt.host [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.450 183134 DEBUG nova.virt.libvirt.host [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.451 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.452 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.452 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.452 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.452 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.452 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.453 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.453 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.453 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.453 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.453 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.453 183134 DEBUG nova.virt.hardware [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.457 183134 DEBUG nova.virt.libvirt.vif [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-207746804',display_name='tempest-TestNetworkAdvancedServerOps-server-207746804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-207746804',id=31,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHN1UvdVdxjRqWodp0BMNbZ+GuyrulD0WI5KcbScYGQgPQB4wl/ZaktEG5xr0Om9ojhk6Hzy9SxfALCy1xa8KSr75yE8ZE1A0eo/1WyunUzyt9Blwa2sI8tAidj85d5Hw==',key_name='tempest-TestNetworkAdvancedServerOps-1735414248',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-mw7kacnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:23Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=0e693c72-183a-4005-8891-207b95ad22b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.457 183134 DEBUG nova.network.os_vif_util [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.458 183134 DEBUG nova.network.os_vif_util [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.458 183134 DEBUG nova.objects.instance [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.474 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <uuid>0e693c72-183a-4005-8891-207b95ad22b1</uuid>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <name>instance-0000001f</name>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-207746804</nova:name>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:29</nova:creationTime>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        <nova:port uuid="747cab40-fbad-4008-a7ac-6cf1f12b6ee4">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <entry name="serial">0e693c72-183a-4005-8891-207b95ad22b1</entry>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <entry name="uuid">0e693c72-183a-4005-8891-207b95ad22b1</entry>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:99:de:f3"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <target dev="tap747cab40-fb"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/console.log" append="off"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:29 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:29 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:29 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:29 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.475 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Preparing to wait for external event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.475 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.476 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.476 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.476 183134 DEBUG nova.virt.libvirt.vif [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-207746804',display_name='tempest-TestNetworkAdvancedServerOps-server-207746804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-207746804',id=31,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHN1UvdVdxjRqWodp0BMNbZ+GuyrulD0WI5KcbScYGQgPQB4wl/ZaktEG5xr0Om9ojhk6Hzy9SxfALCy1xa8KSr75yE8ZE1A0eo/1WyunUzyt9Blwa2sI8tAidj85d5Hw==',key_name='tempest-TestNetworkAdvancedServerOps-1735414248',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-mw7kacnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:23Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=0e693c72-183a-4005-8891-207b95ad22b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.477 183134 DEBUG nova.network.os_vif_util [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.477 183134 DEBUG nova.network.os_vif_util [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.477 183134 DEBUG os_vif [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.478 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.478 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.478 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.481 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.481 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap747cab40-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.481 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap747cab40-fb, col_values=(('external_ids', {'iface-id': '747cab40-fbad-4008-a7ac-6cf1f12b6ee4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:de:f3', 'vm-uuid': '0e693c72-183a-4005-8891-207b95ad22b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.483 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:29 np0005601977 NetworkManager[55565]: <info>  [1769765609.4842] manager: (tap747cab40-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.485 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.489 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.490 183134 INFO os_vif [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb')#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.537 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.538 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.538 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:99:de:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.539 183134 INFO nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Using config drive#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.945 183134 INFO nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Creating config drive at /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config#033[00m
Jan 30 04:33:29 np0005601977 nova_compute[183130]: 2026-01-30 09:33:29.950 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyovup436 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.065 183134 DEBUG oslo_concurrency.processutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyovup436" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:30 np0005601977 kernel: tap747cab40-fb: entered promiscuous mode
Jan 30 04:33:30 np0005601977 NetworkManager[55565]: <info>  [1769765610.1210] manager: (tap747cab40-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/121)
Jan 30 04:33:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:30Z|00284|binding|INFO|Claiming lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for this chassis.
Jan 30 04:33:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:30Z|00285|binding|INFO|747cab40-fbad-4008-a7ac-6cf1f12b6ee4: Claiming fa:16:3e:99:de:f3 10.100.0.12
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.123 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:30Z|00286|binding|INFO|Setting lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 ovn-installed in OVS
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.128 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.129 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.131 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:de:f3 10.100.0.12'], port_security=['fa:16:3e:99:de:f3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0e693c72-183a-4005-8891-207b95ad22b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '90463a7d-c3a0-4624-975d-0cc4b6ff9814', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a98fd02a-19ea-434b-9ec2-1fdf64f82e5f, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=747cab40-fbad-4008-a7ac-6cf1f12b6ee4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:30Z|00287|binding|INFO|Setting lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 up in Southbound
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.132 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.133 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 in datapath b2ca1571-8ba0-4f98-bb63-cbd6ba450393 bound to our chassis#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.135 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2ca1571-8ba0-4f98-bb63-cbd6ba450393#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.145 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[508200d3-fd78-43d2-8fc6-8df537a20147]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.145 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2ca1571-81 in ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.147 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2ca1571-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.147 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[30a3c81c-b678-4a30-a554-9b9b1dd390e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.148 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bfbc8b-82a5-4c73-804a-d6a8d0cf4b2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 systemd-machined[154431]: New machine qemu-24-instance-0000001f.
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.157 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[879f4bf8-1a83-41d7-95bb-3781827d4365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 systemd[1]: Started Virtual Machine qemu-24-instance-0000001f.
Jan 30 04:33:30 np0005601977 systemd-udevd[219730]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.183 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4bacbf98-aa44-4efe-b7da-7cdf1216cd87]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 NetworkManager[55565]: <info>  [1769765610.1924] device (tap747cab40-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:30 np0005601977 NetworkManager[55565]: <info>  [1769765610.1932] device (tap747cab40-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.208 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3f7546-e5ac-484a-86e9-79bbde6a143d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.212 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[27240dbb-7fb6-47ce-a516-266dbcf24484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 NetworkManager[55565]: <info>  [1769765610.2133] manager: (tapb2ca1571-80): new Veth device (/org/freedesktop/NetworkManager/Devices/122)
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.239 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9cef6013-f0c2-4fd9-ab71-8627dda73e39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.243 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ed44a8-9490-442c-b253-1153aff25a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 NetworkManager[55565]: <info>  [1769765610.2600] device (tapb2ca1571-80): carrier: link connected
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.262 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[46f2329c-026a-4423-a249-f2509f0301ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.276 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7b2fde00-822d-40b4-96c0-1e7eaa6ec7b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2ca1571-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:1f:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414160, 'reachable_time': 24767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219760, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.288 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[850f78dc-6978-4599-8ca0-8edde32725b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee1:1f08'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414160, 'tstamp': 414160}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219761, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.301 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bc61c873-817f-4fd3-95c5-b7d50b318cdd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2ca1571-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:1f:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414160, 'reachable_time': 24767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219762, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.325 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d23bce-b892-42d7-b181-248a416fea98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.366 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b4cc589c-f421-4d8d-bb7b-07bf452ab320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.368 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2ca1571-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.368 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.368 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2ca1571-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.370 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 NetworkManager[55565]: <info>  [1769765610.3710] manager: (tapb2ca1571-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Jan 30 04:33:30 np0005601977 kernel: tapb2ca1571-80: entered promiscuous mode
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.377 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2ca1571-80, col_values=(('external_ids', {'iface-id': '92996e6c-be8d-4868-a92b-0dd619c09c89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.378 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:30Z|00288|binding|INFO|Releasing lport 92996e6c-be8d-4868-a92b-0dd619c09c89 from this chassis (sb_readonly=0)
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.382 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.382 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.383 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[32f2a3c1-d0f0-472b-a18f-4b5571efc02e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.384 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b2ca1571-8ba0-4f98-bb63-cbd6ba450393
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.pid.haproxy
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b2ca1571-8ba0-4f98-bb63-cbd6ba450393
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:33:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:30.386 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'env', 'PROCESS_TAG=haproxy-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.636 183134 DEBUG nova.compute.manager [req-e69584d3-c0ef-4724-a791-14762f870a74 req-ee860eb0-140d-4830-8ec9-8a3b4dc9ce3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.638 183134 DEBUG oslo_concurrency.lockutils [req-e69584d3-c0ef-4724-a791-14762f870a74 req-ee860eb0-140d-4830-8ec9-8a3b4dc9ce3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.639 183134 DEBUG oslo_concurrency.lockutils [req-e69584d3-c0ef-4724-a791-14762f870a74 req-ee860eb0-140d-4830-8ec9-8a3b4dc9ce3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.640 183134 DEBUG oslo_concurrency.lockutils [req-e69584d3-c0ef-4724-a791-14762f870a74 req-ee860eb0-140d-4830-8ec9-8a3b4dc9ce3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.640 183134 DEBUG nova.compute.manager [req-e69584d3-c0ef-4724-a791-14762f870a74 req-ee860eb0-140d-4830-8ec9-8a3b4dc9ce3b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Processing event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.645 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765610.6441214, 0e693c72-183a-4005-8891-207b95ad22b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.645 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] VM Started (Lifecycle Event)#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.648 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.652 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.655 183134 INFO nova.virt.libvirt.driver [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance spawned successfully.#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.656 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.674 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.677 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.688 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.688 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.689 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.689 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.689 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.690 183134 DEBUG nova.virt.libvirt.driver [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.709 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.709 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765610.6475508, 0e693c72-183a-4005-8891-207b95ad22b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.709 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:33:30 np0005601977 podman[219801]: 2026-01-30 09:33:30.72728516 +0000 UTC m=+0.078503243 container create 0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:33:30 np0005601977 systemd[1]: Started libpod-conmon-0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a.scope.
Jan 30 04:33:30 np0005601977 podman[219801]: 2026-01-30 09:33:30.668807835 +0000 UTC m=+0.020025938 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:33:30 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:33:30 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e266e5f44c2dc59130f54a126eaa0e11e36577369a07dae7bec3786ae5c754/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:33:30 np0005601977 podman[219801]: 2026-01-30 09:33:30.815653021 +0000 UTC m=+0.166871124 container init 0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 30 04:33:30 np0005601977 podman[219801]: 2026-01-30 09:33:30.823422798 +0000 UTC m=+0.174640881 container start 0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:33:30 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [NOTICE]   (219841) : New worker (219846) forked
Jan 30 04:33:30 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [NOTICE]   (219841) : Loading success.
Jan 30 04:33:30 np0005601977 podman[219814]: 2026-01-30 09:33:30.851665046 +0000 UTC m=+0.080249913 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.853 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.859 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.860 183134 INFO nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Took 7.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.860 183134 DEBUG nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.866 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765610.651511, 0e693c72-183a-4005-8891-207b95ad22b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.867 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.901 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.905 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.951 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.967 183134 INFO nova.compute.manager [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Took 8.64 seconds to build instance.#033[00m
Jan 30 04:33:30 np0005601977 nova_compute[183130]: 2026-01-30 09:33:30.990 183134 DEBUG oslo_concurrency.lockutils [None req-a99c9bbd-825c-436a-8dc2-0b8bc31db26f 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:31 np0005601977 nova_compute[183130]: 2026-01-30 09:33:31.113 183134 DEBUG nova.network.neutron [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updated VIF entry in instance network info cache for port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:31 np0005601977 nova_compute[183130]: 2026-01-30 09:33:31.113 183134 DEBUG nova.network.neutron [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updating instance_info_cache with network_info: [{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:31 np0005601977 nova_compute[183130]: 2026-01-30 09:33:31.133 183134 DEBUG oslo_concurrency.lockutils [req-064037c1-8c16-4b56-89a0-b0c4d0a849aa req-deaa9944-63c4-4ad4-b232-0f2c41ce9b73 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:31 np0005601977 nova_compute[183130]: 2026-01-30 09:33:31.972 183134 INFO nova.virt.libvirt.driver [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Snapshot image upload complete#033[00m
Jan 30 04:33:31 np0005601977 nova_compute[183130]: 2026-01-30 09:33:31.973 183134 INFO nova.compute.manager [None req-4e0b2c2f-2a9a-4916-89b7-b9349f963f88 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Took 4.84 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 30 04:33:32 np0005601977 nova_compute[183130]: 2026-01-30 09:33:32.962 183134 DEBUG nova.compute.manager [req-8c0562c0-a619-480c-aa71-1428dea73ede req-65939dfb-937b-4168-bf50-543484fdd697 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:32 np0005601977 nova_compute[183130]: 2026-01-30 09:33:32.962 183134 DEBUG oslo_concurrency.lockutils [req-8c0562c0-a619-480c-aa71-1428dea73ede req-65939dfb-937b-4168-bf50-543484fdd697 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:32 np0005601977 nova_compute[183130]: 2026-01-30 09:33:32.963 183134 DEBUG oslo_concurrency.lockutils [req-8c0562c0-a619-480c-aa71-1428dea73ede req-65939dfb-937b-4168-bf50-543484fdd697 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:32 np0005601977 nova_compute[183130]: 2026-01-30 09:33:32.963 183134 DEBUG oslo_concurrency.lockutils [req-8c0562c0-a619-480c-aa71-1428dea73ede req-65939dfb-937b-4168-bf50-543484fdd697 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:32 np0005601977 nova_compute[183130]: 2026-01-30 09:33:32.963 183134 DEBUG nova.compute.manager [req-8c0562c0-a619-480c-aa71-1428dea73ede req-65939dfb-937b-4168-bf50-543484fdd697 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:32 np0005601977 nova_compute[183130]: 2026-01-30 09:33:32.963 183134 WARNING nova.compute.manager [req-8c0562c0-a619-480c-aa71-1428dea73ede req-65939dfb-937b-4168-bf50-543484fdd697 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received unexpected event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:33:34 np0005601977 nova_compute[183130]: 2026-01-30 09:33:34.482 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:35 np0005601977 nova_compute[183130]: 2026-01-30 09:33:35.115 183134 DEBUG nova.compute.manager [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-changed-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:35 np0005601977 nova_compute[183130]: 2026-01-30 09:33:35.115 183134 DEBUG nova.compute.manager [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Refreshing instance network info cache due to event network-changed-747cab40-fbad-4008-a7ac-6cf1f12b6ee4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:35 np0005601977 nova_compute[183130]: 2026-01-30 09:33:35.116 183134 DEBUG oslo_concurrency.lockutils [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:35 np0005601977 nova_compute[183130]: 2026-01-30 09:33:35.116 183134 DEBUG oslo_concurrency.lockutils [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:35 np0005601977 nova_compute[183130]: 2026-01-30 09:33:35.116 183134 DEBUG nova.network.neutron [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Refreshing network info cache for port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:35 np0005601977 nova_compute[183130]: 2026-01-30 09:33:35.727 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.037 183134 DEBUG nova.compute.manager [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-changed-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.037 183134 DEBUG nova.compute.manager [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing instance network info cache due to event network-changed-3e8e7ac3-7773-46da-922a-c24dce47f456. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.037 183134 DEBUG oslo_concurrency.lockutils [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.037 183134 DEBUG oslo_concurrency.lockutils [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.038 183134 DEBUG nova.network.neutron [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing network info cache for port 3e8e7ac3-7773-46da-922a-c24dce47f456 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.606 183134 DEBUG nova.network.neutron [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updated VIF entry in instance network info cache for port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.607 183134 DEBUG nova.network.neutron [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updating instance_info_cache with network_info: [{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:37 np0005601977 nova_compute[183130]: 2026-01-30 09:33:37.656 183134 DEBUG oslo_concurrency.lockutils [req-2e1839bd-6656-4fb0-b856-eaa22f688231 req-9cbbfe90-5ef4-47ea-9bc9-44582e30b215 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.331 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.331 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.359 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.469 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.469 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.476 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.477 183134 INFO nova.compute.claims [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.717 183134 DEBUG nova.compute.provider_tree [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.736 183134 DEBUG nova.scheduler.client.report [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.782 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.782 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.844 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.844 183134 DEBUG nova.network.neutron [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.876 183134 INFO nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:33:38 np0005601977 nova_compute[183130]: 2026-01-30 09:33:38.903 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.012 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.014 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.014 183134 INFO nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Creating image(s)#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.015 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "/var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.015 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "/var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.016 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "/var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.016 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "f3bdd19f58c6dd32802b100d2363d205d4b05be4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.017 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "f3bdd19f58c6dd32802b100d2363d205d4b05be4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.094 183134 DEBUG nova.policy [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.435 183134 DEBUG nova.network.neutron [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updated VIF entry in instance network info cache for port 3e8e7ac3-7773-46da-922a-c24dce47f456. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.436 183134 DEBUG nova.network.neutron [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.469 183134 DEBUG oslo_concurrency.lockutils [req-9d5b4c0a-55e9-4d7a-9644-ae7cb1daaa8d req-554b82df-eabb-486a-a7cd-47e866a6067e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:39 np0005601977 nova_compute[183130]: 2026-01-30 09:33:39.484 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:39 np0005601977 podman[219860]: 2026-01-30 09:33:39.845211533 +0000 UTC m=+0.058568603 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:33:40 np0005601977 nova_compute[183130]: 2026-01-30 09:33:40.732 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:40 np0005601977 nova_compute[183130]: 2026-01-30 09:33:40.922 183134 DEBUG nova.network.neutron [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Successfully created port: 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.144 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.145 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.179 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.198 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.238 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.239 183134 DEBUG nova.virt.images [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] a6e939c4-3cd5-464f-b227-1809e53fe850 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.240 183134 DEBUG nova.privsep.utils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.241 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.part /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.288 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.288 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.296 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.297 183134 INFO nova.compute.claims [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Claim successful on node compute-0.ctlplane.example.com
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.472 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.part /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.converted" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.482 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.563 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4.converted --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.564 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "f3bdd19f58c6dd32802b100d2363d205d4b05be4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.599 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.641 183134 DEBUG nova.compute.provider_tree [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.664 183134 DEBUG nova.scheduler.client.report [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.670 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.671 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "f3bdd19f58c6dd32802b100d2363d205d4b05be4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.672 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "f3bdd19f58c6dd32802b100d2363d205d4b05be4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.689 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.706 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.707 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.745 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.746 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4,backing_fmt=raw /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.773 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.773 183134 DEBUG nova.network.neutron [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.779 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4,backing_fmt=raw /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.780 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "f3bdd19f58c6dd32802b100d2363d205d4b05be4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.780 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.800 183134 INFO nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.828 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.838 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.839 183134 DEBUG nova.objects.instance [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lazy-loading 'migration_context' on Instance uuid 6a7e9f4f-a651-4817-a679-b45828fcf5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.859 183134 DEBUG nova.network.neutron [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Successfully updated port: 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.884 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.884 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Ensure instance console log exists: /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.885 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.885 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.885 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.890 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.890 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquired lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.890 183134 DEBUG nova.network.neutron [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.952 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.953 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.954 183134 INFO nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Creating image(s)
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.955 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "/var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.955 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.956 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "/var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:41 np0005601977 nova_compute[183130]: 2026-01-30 09:33:41.975 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.034 183134 DEBUG nova.policy [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.038 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.039 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.040 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.050 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.128 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.129 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.155 183134 DEBUG nova.compute.manager [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-changed-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.155 183134 DEBUG nova.compute.manager [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Refreshing instance network info cache due to event network-changed-4ebb9e31-7061-4ecc-9cbf-98143a8361e4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.156 183134 DEBUG oslo_concurrency.lockutils [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.157 183134 DEBUG nova.network.neutron [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.173 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.174 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.174 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.238 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.239 183134 DEBUG nova.virt.disk.api [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Checking if we can resize image /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.240 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.296 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.297 183134 DEBUG nova.virt.disk.api [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Cannot resize image /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.297 183134 DEBUG nova.objects.instance [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'migration_context' on Instance uuid 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.313 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.314 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Ensure instance console log exists: /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.314 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.314 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:33:42 np0005601977 nova_compute[183130]: 2026-01-30 09:33:42.315 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:33:43 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:43Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:de:f3 10.100.0.12
Jan 30 04:33:43 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:43Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:de:f3 10.100.0.12
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.759 183134 DEBUG nova.network.neutron [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Successfully created port: 6b04d832-453a-4046-a311-7f401c10412f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.811 183134 DEBUG nova.network.neutron [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updating instance_info_cache with network_info: [{"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.846 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Releasing lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.847 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Instance network_info: |[{"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.849 183134 DEBUG oslo_concurrency.lockutils [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.849 183134 DEBUG nova.network.neutron [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Refreshing network info cache for port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.856 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Start _get_guest_xml network_info=[{"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='6c8efdc86387b3a17547112c1c8d0509',container_format='bare',created_at=2026-01-30T09:33:26Z,direct_url=<?>,disk_format='qcow2',id=a6e939c4-3cd5-464f-b227-1809e53fe850,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-462752170',owner='8960c51c5e7f4c65928b539d6bd01b08',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-30T09:33:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.864 183134 WARNING nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.877 183134 DEBUG nova.virt.libvirt.host [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.878 183134 DEBUG nova.virt.libvirt.host [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.882 183134 DEBUG nova.virt.libvirt.host [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.883 183134 DEBUG nova.virt.libvirt.host [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.885 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.886 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='6c8efdc86387b3a17547112c1c8d0509',container_format='bare',created_at=2026-01-30T09:33:26Z,direct_url=<?>,disk_format='qcow2',id=a6e939c4-3cd5-464f-b227-1809e53fe850,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-462752170',owner='8960c51c5e7f4c65928b539d6bd01b08',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-30T09:33:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.887 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.887 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.888 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.888 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.889 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.889 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.890 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.890 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.890 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.891 183134 DEBUG nova.virt.hardware [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.899 183134 DEBUG nova.virt.libvirt.vif [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:33:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-85653886',display_name='tempest-TestSnapshotPattern-server-85653886',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-85653886',id=32,image_ref='a6e939c4-3cd5-464f-b227-1809e53fe850',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBID/K14fCFZHI5JWVWJUAGxCxMra4dWigp369J3sRUJzCc186c+CfXnLX8j6/t+x/1d86id47fcfCswYvS7jgUYMt+7CnhVorESuhnLGDpdEvTT2EjLPSXUofPGYaVdusg==',key_name='tempest-TestSnapshotPattern-1959333507',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8960c51c5e7f4c65928b539d6bd01b08',ramdisk_id='',reservation_id='r-y5fo9o9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='8aafaddd-1368-427e-8596-2b5871053f79',image_min_disk='1',image_min_ram='0',image_owner_id='8960c51c5e7f4c65928b539d6bd01b08',image_owner_project_name='tempest-TestSnapshotPattern-1319331586',image_owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member',image_user_id='7701defc672143599a29756b7b25b4dc',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1319331586',owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:38Z,user_data=None,user_id='7701defc672143599a29756b7b25b4dc',uuid=6a7e9
f4f-a651-4817-a679-b45828fcf5af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.900 183134 DEBUG nova.network.os_vif_util [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converting VIF {"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.901 183134 DEBUG nova.network.os_vif_util [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.903 183134 DEBUG nova.objects.instance [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6a7e9f4f-a651-4817-a679-b45828fcf5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.920 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <uuid>6a7e9f4f-a651-4817-a679-b45828fcf5af</uuid>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <name>instance-00000020</name>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestSnapshotPattern-server-85653886</nova:name>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:43</nova:creationTime>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:user uuid="7701defc672143599a29756b7b25b4dc">tempest-TestSnapshotPattern-1319331586-project-member</nova:user>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:project uuid="8960c51c5e7f4c65928b539d6bd01b08">tempest-TestSnapshotPattern-1319331586</nova:project>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="a6e939c4-3cd5-464f-b227-1809e53fe850"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        <nova:port uuid="4ebb9e31-7061-4ecc-9cbf-98143a8361e4">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <entry name="serial">6a7e9f4f-a651-4817-a679-b45828fcf5af</entry>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <entry name="uuid">6a7e9f4f-a651-4817-a679-b45828fcf5af</entry>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.config"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:52:a5:03"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <target dev="tap4ebb9e31-70"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/console.log" append="off"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <input type="keyboard" bus="usb"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:43 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:43 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:43 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:43 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.920 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Preparing to wait for external event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.920 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.921 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.921 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.922 183134 DEBUG nova.virt.libvirt.vif [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:33:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-85653886',display_name='tempest-TestSnapshotPattern-server-85653886',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-85653886',id=32,image_ref='a6e939c4-3cd5-464f-b227-1809e53fe850',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBID/K14fCFZHI5JWVWJUAGxCxMra4dWigp369J3sRUJzCc186c+CfXnLX8j6/t+x/1d86id47fcfCswYvS7jgUYMt+7CnhVorESuhnLGDpdEvTT2EjLPSXUofPGYaVdusg==',key_name='tempest-TestSnapshotPattern-1959333507',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8960c51c5e7f4c65928b539d6bd01b08',ramdisk_id='',reservation_id='r-y5fo9o9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='8aafaddd-1368-427e-8596-2b5871053f79',image_min_disk='1',image_min_ram='0',image_owner_id='8960c51c5e7f4c65928b539d6bd01b08',image_owner_project_name='tempest-TestSnapshotPattern-1319331586',image_owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member',image_user_id='7701defc672143599a29756b7b25b4dc',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1319331586',owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:38Z,user_data=None,user_id='7701defc672143599a29756b7b25b4dc',
uuid=6a7e9f4f-a651-4817-a679-b45828fcf5af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.922 183134 DEBUG nova.network.os_vif_util [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converting VIF {"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.923 183134 DEBUG nova.network.os_vif_util [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.923 183134 DEBUG os_vif [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.924 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.924 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.925 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.928 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.928 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ebb9e31-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.929 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ebb9e31-70, col_values=(('external_ids', {'iface-id': '4ebb9e31-7061-4ecc-9cbf-98143a8361e4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:a5:03', 'vm-uuid': '6a7e9f4f-a651-4817-a679-b45828fcf5af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.931 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:43 np0005601977 NetworkManager[55565]: <info>  [1769765623.9322] manager: (tap4ebb9e31-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.936 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.938 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:43 np0005601977 nova_compute[183130]: 2026-01-30 09:33:43.939 183134 INFO os_vif [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70')#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.014 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.015 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.015 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] No VIF found with MAC fa:16:3e:52:a5:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.016 183134 INFO nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Using config drive#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.664 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:44.664 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:44.667 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.839 183134 INFO nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Creating config drive at /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.config#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.843 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4oh0juh1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:44 np0005601977 nova_compute[183130]: 2026-01-30 09:33:44.961 183134 DEBUG oslo_concurrency.processutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4oh0juh1" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:45 np0005601977 NetworkManager[55565]: <info>  [1769765625.0207] manager: (tap4ebb9e31-70): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 30 04:33:45 np0005601977 kernel: tap4ebb9e31-70: entered promiscuous mode
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.024 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:45Z|00289|binding|INFO|Claiming lport 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 for this chassis.
Jan 30 04:33:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:45Z|00290|binding|INFO|4ebb9e31-7061-4ecc-9cbf-98143a8361e4: Claiming fa:16:3e:52:a5:03 10.100.0.8
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.037 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:a5:03 10.100.0.8'], port_security=['fa:16:3e:52:a5:03 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8980838c-37f7-45e5-9084-1321907354d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cc00530c-da00-4b1f-8544-f4f16829e051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4c33c50-4f9e-4c9d-ac8f-b1ee6c0d33bf, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=4ebb9e31-7061-4ecc-9cbf-98143a8361e4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.038 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 in datapath 8980838c-37f7-45e5-9084-1321907354d2 bound to our chassis#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.040 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8980838c-37f7-45e5-9084-1321907354d2#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.040 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:45Z|00291|binding|INFO|Setting lport 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 ovn-installed in OVS
Jan 30 04:33:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:45Z|00292|binding|INFO|Setting lport 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 up in Southbound
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.046 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.047 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.057 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7d11bf-5844-455c-aa48-13b1e22fdcfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:45 np0005601977 systemd-udevd[219961]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:45 np0005601977 systemd-machined[154431]: New machine qemu-25-instance-00000020.
Jan 30 04:33:45 np0005601977 NetworkManager[55565]: <info>  [1769765625.0729] device (tap4ebb9e31-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:45 np0005601977 NetworkManager[55565]: <info>  [1769765625.0735] device (tap4ebb9e31-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:45 np0005601977 systemd[1]: Started Virtual Machine qemu-25-instance-00000020.
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.090 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffdeb9f-ff2c-44b1-b52f-1ded68def7b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.093 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[67630133-c0c1-4a44-ac77-45f8d795add3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.108 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[79cbf2df-e3c9-4bdf-9f91-b57ac884d1f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.110 183134 DEBUG nova.network.neutron [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Successfully updated port: 6b04d832-453a-4046-a311-7f401c10412f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.119 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c48fc3ab-2889-4b7c-918c-a73b546147cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8980838c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:7d:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411642, 'reachable_time': 33905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219973, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.125 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.126 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquired lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.126 183134 DEBUG nova.network.neutron [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.132 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7a52e2a2-fd83-420f-89e1-4526cd21abbb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8980838c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411649, 'tstamp': 411649}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219974, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8980838c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411650, 'tstamp': 411650}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219974, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.135 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8980838c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.136 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.137 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.137 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8980838c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.138 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.138 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8980838c-30, col_values=(('external_ids', {'iface-id': '50e26df2-7d93-4204-9b22-94b2140c0f47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:45.139 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.200 183134 DEBUG nova.compute.manager [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-changed-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.201 183134 DEBUG nova.compute.manager [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Refreshing instance network info cache due to event network-changed-6b04d832-453a-4046-a311-7f401c10412f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.201 183134 DEBUG oslo_concurrency.lockutils [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.273 183134 DEBUG nova.network.neutron [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.298 183134 DEBUG nova.compute.manager [req-0258f176-fbcc-4ba8-ae84-ea4e85167b74 req-0f28db20-7df1-40ed-ba95-8e699f953e38 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.298 183134 DEBUG oslo_concurrency.lockutils [req-0258f176-fbcc-4ba8-ae84-ea4e85167b74 req-0f28db20-7df1-40ed-ba95-8e699f953e38 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.299 183134 DEBUG oslo_concurrency.lockutils [req-0258f176-fbcc-4ba8-ae84-ea4e85167b74 req-0f28db20-7df1-40ed-ba95-8e699f953e38 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.299 183134 DEBUG oslo_concurrency.lockutils [req-0258f176-fbcc-4ba8-ae84-ea4e85167b74 req-0f28db20-7df1-40ed-ba95-8e699f953e38 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.299 183134 DEBUG nova.compute.manager [req-0258f176-fbcc-4ba8-ae84-ea4e85167b74 req-0f28db20-7df1-40ed-ba95-8e699f953e38 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Processing event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.534 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765625.533619, 6a7e9f4f-a651-4817-a679-b45828fcf5af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.534 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] VM Started (Lifecycle Event)#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.538 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.544 183134 DEBUG nova.virt.libvirt.driver [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.548 183134 INFO nova.virt.libvirt.driver [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Instance spawned successfully.#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.549 183134 INFO nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Took 6.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.549 183134 DEBUG nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.565 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.569 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.596 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.596 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765625.5338204, 6a7e9f4f-a651-4817-a679-b45828fcf5af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.597 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.638 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.641 183134 INFO nova.compute.manager [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Took 7.20 seconds to build instance.#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.644 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765625.5415845, 6a7e9f4f-a651-4817-a679-b45828fcf5af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.644 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.665 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.665 183134 DEBUG oslo_concurrency.lockutils [None req-97eb2b36-c4f6-43d0-ad30-7ec9e1a1144c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.669 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.732 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.973 183134 DEBUG nova.network.neutron [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updated VIF entry in instance network info cache for port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.973 183134 DEBUG nova.network.neutron [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updating instance_info_cache with network_info: [{"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:45 np0005601977 nova_compute[183130]: 2026-01-30 09:33:45.995 183134 DEBUG oslo_concurrency.lockutils [req-80857bdf-e26d-4f1d-935d-e19b180ea81f req-6674fba1-43a8-4262-ac19-4267c02e6aa2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.404 183134 DEBUG nova.network.neutron [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Updating instance_info_cache with network_info: [{"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.433 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Releasing lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.433 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Instance network_info: |[{"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.434 183134 DEBUG oslo_concurrency.lockutils [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.434 183134 DEBUG nova.network.neutron [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Refreshing network info cache for port 6b04d832-453a-4046-a311-7f401c10412f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.438 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Start _get_guest_xml network_info=[{"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.443 183134 WARNING nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.458 183134 DEBUG nova.virt.libvirt.host [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.459 183134 DEBUG nova.virt.libvirt.host [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.464 183134 DEBUG nova.virt.libvirt.host [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.464 183134 DEBUG nova.virt.libvirt.host [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.465 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.466 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.466 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.466 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.467 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.467 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.467 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.467 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.468 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.468 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.468 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.468 183134 DEBUG nova.virt.hardware [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.472 183134 DEBUG nova.virt.libvirt.vif [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:33:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=33,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHWc7+iju0ig/cYYtgMF7FccVXU/cVOvBYUArFUjtOyOMzdPSWipY4qhftKwG2kAT2FaeRfqftE1sruqmFqkCPVFpP923bzNJR9Cde3eohExOkgLh5N+aVAVzBeqt1QUXA==',key_name='tempest-TestSecurityGroupsBasicOps-1925360427',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-j5kq0qm1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:41Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=12406b2c-7c9c-41b8-b0c7-30bf4455b4a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.473 183134 DEBUG nova.network.os_vif_util [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.474 183134 DEBUG nova.network.os_vif_util [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.475 183134 DEBUG nova.objects.instance [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'pci_devices' on Instance uuid 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.495 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <uuid>12406b2c-7c9c-41b8-b0c7-30bf4455b4a9</uuid>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <name>instance-00000021</name>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652</nova:name>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:46</nova:creationTime>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:user uuid="594b0fd4bee7455ab5aac7774bd07b70">tempest-TestSecurityGroupsBasicOps-2060529369-project-member</nova:user>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:project uuid="396e2944b44f42e59b102db87e2e060c">tempest-TestSecurityGroupsBasicOps-2060529369</nova:project>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        <nova:port uuid="6b04d832-453a-4046-a311-7f401c10412f">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <entry name="serial">12406b2c-7c9c-41b8-b0c7-30bf4455b4a9</entry>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <entry name="uuid">12406b2c-7c9c-41b8-b0c7-30bf4455b4a9</entry>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.config"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:9d:42:f4"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <target dev="tap6b04d832-45"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/console.log" append="off"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:46 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:46 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:46 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:46 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.496 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Preparing to wait for external event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.496 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.496 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.497 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.497 183134 DEBUG nova.virt.libvirt.vif [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:33:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=33,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHWc7+iju0ig/cYYtgMF7FccVXU/cVOvBYUArFUjtOyOMzdPSWipY4qhftKwG2kAT2FaeRfqftE1sruqmFqkCPVFpP923bzNJR9Cde3eohExOkgLh5N+aVAVzBeqt1QUXA==',key_name='tempest-TestSecurityGroupsBasicOps-1925360427',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-j5kq0qm1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:41Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=12406b2c-7c9c-41b8-b0c7-30bf4455b4a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.498 183134 DEBUG nova.network.os_vif_util [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.499 183134 DEBUG nova.network.os_vif_util [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.499 183134 DEBUG os_vif [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.500 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.500 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.500 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.503 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.504 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b04d832-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.504 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b04d832-45, col_values=(('external_ids', {'iface-id': '6b04d832-453a-4046-a311-7f401c10412f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:42:f4', 'vm-uuid': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.506 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:46 np0005601977 NetworkManager[55565]: <info>  [1769765626.5076] manager: (tap6b04d832-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.508 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.513 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.514 183134 INFO os_vif [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45')#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.587 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.587 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.588 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] No VIF found with MAC fa:16:3e:9d:42:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:46 np0005601977 nova_compute[183130]: 2026-01-30 09:33:46.588 183134 INFO nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Using config drive#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.005 183134 INFO nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Creating config drive at /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.config#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.011 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqve33sr1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.133 183134 DEBUG oslo_concurrency.processutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqve33sr1" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:47 np0005601977 kernel: tap6b04d832-45: entered promiscuous mode
Jan 30 04:33:47 np0005601977 NetworkManager[55565]: <info>  [1769765627.1917] manager: (tap6b04d832-45): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Jan 30 04:33:47 np0005601977 systemd-udevd[219965]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.194 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:47Z|00293|binding|INFO|Claiming lport 6b04d832-453a-4046-a311-7f401c10412f for this chassis.
Jan 30 04:33:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:47Z|00294|binding|INFO|6b04d832-453a-4046-a311-7f401c10412f: Claiming fa:16:3e:9d:42:f4 10.100.0.8
Jan 30 04:33:47 np0005601977 NetworkManager[55565]: <info>  [1769765627.2017] device (tap6b04d832-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:47 np0005601977 NetworkManager[55565]: <info>  [1769765627.2023] device (tap6b04d832-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.209 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:42:f4 10.100.0.8'], port_security=['fa:16:3e:9d:42:f4 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-632dc37f-a471-48f7-998e-601c234d5eea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b89a168-ee83-4ac9-852d-dbd31b3e41f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa3c73d5-3fb7-4892-bbfe-678dc6ae4603, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6b04d832-453a-4046-a311-7f401c10412f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.210 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6b04d832-453a-4046-a311-7f401c10412f in datapath 632dc37f-a471-48f7-998e-601c234d5eea bound to our chassis#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.212 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 632dc37f-a471-48f7-998e-601c234d5eea#033[00m
Jan 30 04:33:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:47Z|00295|binding|INFO|Setting lport 6b04d832-453a-4046-a311-7f401c10412f ovn-installed in OVS
Jan 30 04:33:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:47Z|00296|binding|INFO|Setting lport 6b04d832-453a-4046-a311-7f401c10412f up in Southbound
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.216 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.219 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:47 np0005601977 systemd-machined[154431]: New machine qemu-26-instance-00000021.
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.227 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[22d2247a-892c-4db5-b423-4035d5a606a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:47 np0005601977 systemd[1]: Started Virtual Machine qemu-26-instance-00000021.
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.248 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[fa462fe8-534f-4e92-9697-adb28d863584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.252 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a442d8-19b7-4ab7-84b9-52671476b9ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.276 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[23adfd71-8804-439e-84af-13901d5fd2bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.306 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9eacf872-7a94-40f7-b204-b50541a4fd54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap632dc37f-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:e5:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411455, 'reachable_time': 25559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220014, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.321 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a7efc797-6b21-4e9b-af8a-8bdca014fd7c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap632dc37f-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411464, 'tstamp': 411464}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220016, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap632dc37f-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411466, 'tstamp': 411466}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220016, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.323 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap632dc37f-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.324 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.326 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.326 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap632dc37f-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.326 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.327 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap632dc37f-a0, col_values=(('external_ids', {'iface-id': '13570b6a-d879-43dc-b830-8118569a82b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:47.328 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.417 183134 DEBUG nova.compute.manager [req-b4f5445b-8c03-4c15-961a-bc72bc1e4f87 req-83ce6f4e-7115-4da4-badd-a98db7a8ea12 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.418 183134 DEBUG oslo_concurrency.lockutils [req-b4f5445b-8c03-4c15-961a-bc72bc1e4f87 req-83ce6f4e-7115-4da4-badd-a98db7a8ea12 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.418 183134 DEBUG oslo_concurrency.lockutils [req-b4f5445b-8c03-4c15-961a-bc72bc1e4f87 req-83ce6f4e-7115-4da4-badd-a98db7a8ea12 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.419 183134 DEBUG oslo_concurrency.lockutils [req-b4f5445b-8c03-4c15-961a-bc72bc1e4f87 req-83ce6f4e-7115-4da4-badd-a98db7a8ea12 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.419 183134 DEBUG nova.compute.manager [req-b4f5445b-8c03-4c15-961a-bc72bc1e4f87 req-83ce6f4e-7115-4da4-badd-a98db7a8ea12 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] No waiting events found dispatching network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.419 183134 WARNING nova.compute.manager [req-b4f5445b-8c03-4c15-961a-bc72bc1e4f87 req-83ce6f4e-7115-4da4-badd-a98db7a8ea12 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received unexpected event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.472 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765627.4724536, 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.473 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] VM Started (Lifecycle Event)#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.501 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.508 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765627.4749103, 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.509 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.531 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.536 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.558 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.918 183134 DEBUG nova.network.neutron [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Updated VIF entry in instance network info cache for port 6b04d832-453a-4046-a311-7f401c10412f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.919 183134 DEBUG nova.network.neutron [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Updating instance_info_cache with network_info: [{"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:47 np0005601977 nova_compute[183130]: 2026-01-30 09:33:47.952 183134 DEBUG oslo_concurrency.lockutils [req-8b19d8c1-62b2-4771-a66f-084d6c06010b req-3620f575-708e-45d9-bbd2-e53727738e84 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:48 np0005601977 podman[220025]: 2026-01-30 09:33:48.851867136 +0000 UTC m=+0.057036360 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:33:48 np0005601977 podman[220024]: 2026-01-30 09:33:48.855579713 +0000 UTC m=+0.061162049 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.515 183134 DEBUG nova.compute.manager [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.515 183134 DEBUG oslo_concurrency.lockutils [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.516 183134 DEBUG oslo_concurrency.lockutils [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.516 183134 DEBUG oslo_concurrency.lockutils [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.516 183134 DEBUG nova.compute.manager [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Processing event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.517 183134 DEBUG nova.compute.manager [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.517 183134 DEBUG oslo_concurrency.lockutils [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.517 183134 DEBUG oslo_concurrency.lockutils [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.517 183134 DEBUG oslo_concurrency.lockutils [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.518 183134 DEBUG nova.compute.manager [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] No waiting events found dispatching network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.518 183134 WARNING nova.compute.manager [req-a99a9c24-7bed-4ea6-875b-7831de6c4064 req-b6d0d4f4-aaea-4fb9-88cf-fbc53b87a3ef dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received unexpected event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.519 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.541 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765629.5306408, 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.541 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.542 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.546 183134 INFO nova.virt.libvirt.driver [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Instance spawned successfully.#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.546 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.571 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.577 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.580 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.580 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.581 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.581 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.582 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.582 183134 DEBUG nova.virt.libvirt.driver [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.624 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.660 183134 INFO nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Took 7.71 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.660 183134 DEBUG nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.724 183134 INFO nova.compute.manager [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Took 8.46 seconds to build instance.#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.733 183134 INFO nova.compute.manager [None req-47e73310-5948-432b-b663-4bba579b62b8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Get console output#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.739 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:33:49 np0005601977 nova_compute[183130]: 2026-01-30 09:33:49.750 183134 DEBUG oslo_concurrency.lockutils [None req-4c11e26c-1ad2-407a-91c6-d9e14c638d17 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:50 np0005601977 nova_compute[183130]: 2026-01-30 09:33:50.748 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:50 np0005601977 nova_compute[183130]: 2026-01-30 09:33:50.851 183134 INFO nova.compute.manager [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Rebuilding instance#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.161 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.185 183134 DEBUG nova.compute.manager [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.247 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_requests' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.259 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.275 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.288 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.299 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.302 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.508 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.624 183134 DEBUG nova.compute.manager [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-changed-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.625 183134 DEBUG nova.compute.manager [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Refreshing instance network info cache due to event network-changed-4ebb9e31-7061-4ecc-9cbf-98143a8361e4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.625 183134 DEBUG oslo_concurrency.lockutils [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.625 183134 DEBUG oslo_concurrency.lockutils [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:51 np0005601977 nova_compute[183130]: 2026-01-30 09:33:51.625 183134 DEBUG nova.network.neutron [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Refreshing network info cache for port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:52.669 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:53 np0005601977 kernel: tap747cab40-fb (unregistering): left promiscuous mode
Jan 30 04:33:53 np0005601977 NetworkManager[55565]: <info>  [1769765633.4844] device (tap747cab40-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:33:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:53Z|00297|binding|INFO|Releasing lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 from this chassis (sb_readonly=0)
Jan 30 04:33:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:53Z|00298|binding|INFO|Setting lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 down in Southbound
Jan 30 04:33:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:53Z|00299|binding|INFO|Removing iface tap747cab40-fb ovn-installed in OVS
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.495 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.501 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.509 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.510 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:de:f3 10.100.0.12'], port_security=['fa:16:3e:99:de:f3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0e693c72-183a-4005-8891-207b95ad22b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '90463a7d-c3a0-4624-975d-0cc4b6ff9814', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.244'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a98fd02a-19ea-434b-9ec2-1fdf64f82e5f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=747cab40-fbad-4008-a7ac-6cf1f12b6ee4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.513 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 in datapath b2ca1571-8ba0-4f98-bb63-cbd6ba450393 unbound from our chassis#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.518 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2ca1571-8ba0-4f98-bb63-cbd6ba450393, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.519 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[48a02cb8-0059-436b-a1c7-7b4987942b6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.520 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 namespace which is not needed anymore#033[00m
Jan 30 04:33:53 np0005601977 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 30 04:33:53 np0005601977 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001f.scope: Consumed 12.708s CPU time.
Jan 30 04:33:53 np0005601977 systemd-machined[154431]: Machine qemu-24-instance-0000001f terminated.
Jan 30 04:33:53 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [NOTICE]   (219841) : haproxy version is 2.8.14-c23fe91
Jan 30 04:33:53 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [NOTICE]   (219841) : path to executable is /usr/sbin/haproxy
Jan 30 04:33:53 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [WARNING]  (219841) : Exiting Master process...
Jan 30 04:33:53 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [ALERT]    (219841) : Current worker (219846) exited with code 143 (Terminated)
Jan 30 04:33:53 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[219817]: [WARNING]  (219841) : All workers exited. Exiting... (0)
Jan 30 04:33:53 np0005601977 systemd[1]: libpod-0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a.scope: Deactivated successfully.
Jan 30 04:33:53 np0005601977 podman[220087]: 2026-01-30 09:33:53.654486428 +0000 UTC m=+0.046937068 container died 0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:33:53 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a-userdata-shm.mount: Deactivated successfully.
Jan 30 04:33:53 np0005601977 systemd[1]: var-lib-containers-storage-overlay-14e266e5f44c2dc59130f54a126eaa0e11e36577369a07dae7bec3786ae5c754-merged.mount: Deactivated successfully.
Jan 30 04:33:53 np0005601977 podman[220087]: 2026-01-30 09:33:53.699402376 +0000 UTC m=+0.091853016 container cleanup 0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:33:53 np0005601977 systemd[1]: libpod-conmon-0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a.scope: Deactivated successfully.
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.718 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.722 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.745 183134 DEBUG nova.network.neutron [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updated VIF entry in instance network info cache for port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.746 183134 DEBUG nova.network.neutron [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updating instance_info_cache with network_info: [{"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:53 np0005601977 podman[220115]: 2026-01-30 09:33:53.756912017 +0000 UTC m=+0.042924381 container remove 0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.759 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e3971ea0-30a5-4102-b369-39c0dfaf63f0]: (4, ('Fri Jan 30 09:33:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 (0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a)\n0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a\nFri Jan 30 09:33:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 (0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a)\n0169a435c2d5f670a756a29d4f6de4b637706451f94fe9c23d96b7214e08f71a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.761 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[09b46315-6084-4121-a56b-2bd952a28ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.764 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2ca1571-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:53 np0005601977 kernel: tapb2ca1571-80: left promiscuous mode
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.768 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.774 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.782 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[42e789e7-291f-4530-9a3e-6f6d2129171f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.805 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7b28bda1-4841-47d7-999a-f1fd9d266b0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.806 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7789580c-e570-47b1-b07e-311d4a94b361]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.818 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab90791-ea3b-4f06-9d79-108149abaa65]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414154, 'reachable_time': 16824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220148, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.820 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:33:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:53.820 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff171dc-9641-4afb-8d62-2816b2099172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:53 np0005601977 systemd[1]: run-netns-ovnmeta\x2db2ca1571\x2d8ba0\x2d4f98\x2dbb63\x2dcbd6ba450393.mount: Deactivated successfully.
Jan 30 04:33:53 np0005601977 nova_compute[183130]: 2026-01-30 09:33:53.828 183134 DEBUG oslo_concurrency.lockutils [req-6b3e9cc9-fb1e-489a-84c6-05daa530b9ef req-a1dfc2c0-cf08-45b2-94cb-e58daff1dbcd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.045 183134 DEBUG nova.compute.manager [req-952a5610-3965-4d04-a6c6-834cbbe8ad0c req-4e7d16f0-11ab-4861-9bce-31bd21e52884 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-unplugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.046 183134 DEBUG oslo_concurrency.lockutils [req-952a5610-3965-4d04-a6c6-834cbbe8ad0c req-4e7d16f0-11ab-4861-9bce-31bd21e52884 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.046 183134 DEBUG oslo_concurrency.lockutils [req-952a5610-3965-4d04-a6c6-834cbbe8ad0c req-4e7d16f0-11ab-4861-9bce-31bd21e52884 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.046 183134 DEBUG oslo_concurrency.lockutils [req-952a5610-3965-4d04-a6c6-834cbbe8ad0c req-4e7d16f0-11ab-4861-9bce-31bd21e52884 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.047 183134 DEBUG nova.compute.manager [req-952a5610-3965-4d04-a6c6-834cbbe8ad0c req-4e7d16f0-11ab-4861-9bce-31bd21e52884 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-unplugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.047 183134 WARNING nova.compute.manager [req-952a5610-3965-4d04-a6c6-834cbbe8ad0c req-4e7d16f0-11ab-4861-9bce-31bd21e52884 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received unexpected event network-vif-unplugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.317 183134 INFO nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance shutdown successfully after 3 seconds.#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.323 183134 INFO nova.virt.libvirt.driver [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance destroyed successfully.#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.329 183134 INFO nova.virt.libvirt.driver [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance destroyed successfully.#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.330 183134 DEBUG nova.virt.libvirt.vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-207746804',display_name='tempest-TestNetworkAdvancedServerOps-server-207746804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-207746804',id=31,image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHN1UvdVdxjRqWodp0BMNbZ+GuyrulD0WI5KcbScYGQgPQB4wl/ZaktEG5xr0Om9ojhk6Hzy9SxfALCy1xa8KSr75yE8ZE1A0eo/1WyunUzyt9Blwa2sI8tAidj85d5Hw==',key_name='tempest-TestNetworkAdvancedServerOps-1735414248',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-mw7kacnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:50Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=0e693c72-183a-4005-8891-207b95ad22b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.330 183134 DEBUG nova.network.os_vif_util [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.331 183134 DEBUG nova.network.os_vif_util [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.332 183134 DEBUG os_vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.334 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.335 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap747cab40-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.341 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.344 183134 INFO os_vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb')#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.345 183134 INFO nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Deleting instance files /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1_del#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.345 183134 INFO nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Deletion of /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1_del complete#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.509 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.509 183134 INFO nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Creating image(s)#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.510 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.510 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.511 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.511 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e5be927f4d5d3cf8a551fcd7e66a81d6274021ec" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.512 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e5be927f4d5d3cf8a551fcd7e66a81d6274021ec" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.853 183134 DEBUG nova.compute.manager [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-changed-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.853 183134 DEBUG nova.compute.manager [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Refreshing instance network info cache due to event network-changed-6b04d832-453a-4046-a311-7f401c10412f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.854 183134 DEBUG oslo_concurrency.lockutils [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.854 183134 DEBUG oslo_concurrency.lockutils [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:33:54 np0005601977 nova_compute[183130]: 2026-01-30 09:33:54.855 183134 DEBUG nova.network.neutron [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Refreshing network info cache for port 6b04d832-453a-4046-a311-7f401c10412f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.451 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'name': 'tempest-TestSnapshotPattern-server-85653886', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000020', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8960c51c5e7f4c65928b539d6bd01b08', 'user_id': '7701defc672143599a29756b7b25b4dc', 'hostId': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.454 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8aafaddd-1368-427e-8596-2b5871053f79', 'name': 'tempest-TestSnapshotPattern-server-1086942880', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8960c51c5e7f4c65928b539d6bd01b08', 'user_id': '7701defc672143599a29756b7b25b4dc', 'hostId': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.456 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'name': 'tempest-TestNetworkBasicOps-server-889211547', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '33bba0bc2a744596b558c6598a1970de', 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'hostId': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.457 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '65c66677-23b6-479a-863f-3dd277183a7d', 'name': 'tempest-TestNetworkBasicOps-server-667740087', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '33bba0bc2a744596b558c6598a1970de', 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'hostId': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.459 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '396e2944b44f42e59b102db87e2e060c', 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'hostId': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.460 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '396e2944b44f42e59b102db87e2e060c', 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'hostId': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.463 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6a7e9f4f-a651-4817-a679-b45828fcf5af / tap4ebb9e31-70 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.463 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.465 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8aafaddd-1368-427e-8596-2b5871053f79 / tapc0d4f325-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.465 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.outgoing.packets volume: 59 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.467 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f / tapf469de0f-e3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.467 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f / tap3e8e7ac3-77 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.468 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.packets volume: 578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.468 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.469 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 65c66677-23b6-479a-863f-3dd277183a7d / tapa5afd5ba-13 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.470 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.outgoing.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.471 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aed146e3-865d-4aee-a055-42ed41e035c5 / tap8472693d-cc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.471 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.outgoing.packets volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.473 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 / tap6b04d832-45 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.473 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72a34573-5831-4aef-9453-65a4cbf3f336', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.460935', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cbe14540-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '763dfe51ec87dce99597107aebf9435c910b10810f59930eecabf8e51adf5946'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 59, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.460935', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cbe19ed2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': '5a6fc11107f51026de1c5c60a3a2a64982fc11eec80137994381e15be248b73a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 578, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.460935', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cbe1f972-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '3f02d3da7bb594ac64739c1d1f86339f71c9868bc2a9a8ee20371eb689896028'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.460935', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cbe20340-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'a600d8c551e3f6d406ccb4671ae26f14ecaa7fd18da4a7350b70dbffea3adc63'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.460935', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cbe242ec-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': '2cf99e79d7f03973d0f31d3ff53f1da2058f5e664e9ae2e9b0445524eeea1b82'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 95, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e035c5-tap8472693d-cc', 't
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: empest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cbe2882e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': '15c41b87e6840bfde027e7f2767215ed107e144c7507d3513d9fc81ea4c61898'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.460935', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cbe2d1bc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '08c041f68db7c6f65325c06fd49b72efc953ed9ddb7bf1298f8605d3f21b61c3'}]}, 'timestamp': '2026-01-30 09:33:55.474014', '_unique_id': 'ffbe1f3b2cf4448499f9e3c08e5f0b67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.475 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.476 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.outgoing.bytes volume: 8096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.476 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.bytes volume: 88064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.476 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.bytes volume: 4068 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.477 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.outgoing.bytes volume: 2936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.477 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.outgoing.bytes volume: 13348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.477 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94396209-9aef-4f76-b607-449df75be98b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.475947', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cbe32964-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '0f74a6a2704b67510d9f78238b278661258de49f366e05d11637c11ced16089c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8096, 'user_id': 
'7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.475947', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cbe33684-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': '2889e4554ae5b6485e7391fb72d73c83799569b7e5cef3b9505a93172d982e5b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 88064, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.475947', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cbe34200-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'bb0380825bdeeb8efe5b40725fb34bad12a4c82978222b810273fa512271bee8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4068, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.475947', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cbe34d4a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '6f0f26aa6bc7e481e55ef5712b69497dbbe5a2045f6026e30c5527ee0b6121a5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2936, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.475947', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cbe359ca-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': 'c7f66a931c5c58e194c835a045a011896fe01bf930c41ab329f9580d8b90698d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13348, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e035c5-tap8472693d-cc', 'timestamp': '2026-01-30T09:33:55
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: ps-2060529369-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cbe364ec-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': 'e58f72d371f71b9227b485597f91a2dcc1cceaad91b04ad61539037c3e5f6825'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.475947', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cbe36fe6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': 'fab2293c8bca363ef101e82a8d7007030ce253bf66b39c331a609de90f2a52b8'}]}, 'timestamp': '2026-01-30 09:33:55.478065', '_unique_id': '646c2e4cffd94840ac1556d1983a3bfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.479 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.500 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.501 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.526 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.read.requests volume: 1143 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.527 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.553 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.read.requests volume: 1059 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.554 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.580 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.581 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.610 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.read.requests volume: 1078 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.610 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.642 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.642 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba020500-cf57-4927-9d6d-93e18e26d7be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbe7055c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': 'b44564fdea35c2537b9026e2ff31b8e16608503220315b1f9c793168b31b3ba3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 
'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbe716a0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '24a5fbdeebfb535ca5bb818822e628b7dfcd0bbcd58c20559489c9ab09fa92ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1143, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbeafd42-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': 'b9f520efff3c21fe9422ed7a49796486e87a62cf46c1c9f30cdde57fe63903c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbeb09e0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '61680eaea8671fd3782ffa0417c8ea4359e793e74e67371b622d40b6e377e4b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1059, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 
'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbef0cde-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': '77a83162e8cba81cbd7c45fabab672f2b5a740494b5364c6231e424143314c4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: _gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbef1b8e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'c679726d478097cf21eafad7edeac499dfd05cef02281a0708a1c3b294748d20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbf32c56-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '6e9ef6f5da709afcbc4ef411cb58aa8420a42dd4974e32e7e1e570d69af6a674'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 
'65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbf33ba6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': 'f34df0610c67d214b24a0f2fd876f5c5c7e269f433413ebd56825f5cffa9adb8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1078, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbf7b4c4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '1cd44581cd46e7917489f514b136ed450d802da8f1b98a4d4f8443bfba26ed00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbf7c31a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '43b74f9f80bc1a50c5321fec6bd10b7b869dacd03aac998ff551998bd8257fc6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfc8972-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '342337f84bbb87531ed4ad975fbef30795d7ef0aad92be025e12c0a1e0066c4b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.479513', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfc95e8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '10f6a5d600208acb5d7609261d549b1576dc9de4fcf5b8e009d3601c65a03108'}]}, 'timestamp': '2026-01-30 09:33:55.642873', '_unique_id': '239822b5b55e40daafc3249f53f200e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.645 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.645 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.645 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.645 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.646 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.646 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.646 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.646 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.647 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f92645b3-e9ff-440f-97a6-969ce006fda1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.645529', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cbfd0956-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': 'f8f518b8cc93bb1879d408ecccd54349fca2155745623d4e7013aaba06b274f7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.645529', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cbfd145a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': '7d792060f25ec3a1ce2f1162233d8333191d752f6ed50e63ff23ad0aa3d71a4c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.645529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cbfd1f72-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '72cb375272bdc98da7142d9d0a9da3a3e1a3c7344f24d87c8c4447a3102c3bb4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.645529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cbfd29b8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '3755fd6e3655aada51c07d13f88507f213d78c3554f6b760848532eba355bad6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.645529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cbfd33f4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': '3cb7ce109f8d20f99c6fe2d54ed20ff9e20e4afdcf1edf83199c68d571526627'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: splay_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cbfd3e44-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': '9e350cebc8530c09523352dc5f43b5726366a0b67d2527b1aa496f4015ffc48c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.645529', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 
1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cbfd4a42-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '0ed1d2bed05a090390d9cd76c7c524caf2d2e5c9e555563500108d91abd3bd65'}]}, 'timestamp': '2026-01-30 09:33:55.647485', '_unique_id': '497553293a5947029c5bb62477dd5bba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.649 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.read.latency volume: 567818526 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.649 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.read.latency volume: 2717538 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.649 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.read.latency volume: 496205957 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.649 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.read.latency volume: 47909601 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.650 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.read.latency volume: 519207016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.650 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.read.latency volume: 106932520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.650 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.read.latency volume: 531572580 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.650 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.read.latency volume: 43019849 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.651 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.read.latency volume: 492062166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.651 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.read.latency volume: 45222605 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.651 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.read.latency volume: 369496916 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.651 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.read.latency volume: 3545792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdf4268b-a542-4a30-b44e-bac124fd95b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 567818526, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfd94d4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': 'b42ef8526d620ef22fbb9131e802427a01f37329fcbd3e6370576e938a3610ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2717538, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 
'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfd9f2e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '2633d41b2932a10302d14978bb2b39fb3ea2fe8d8de190f29a2811ccfd8d45fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 496205957, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfda906-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '94cc0d37ecc38e29c480270202982e6fc2aaf9d2151438651c3f1402b99c7226'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47909601, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfdb2fc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '67e27407a4c6c950b12c98e05ff8989a346c3ba473146dcedc4f04c4f6b07f7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 519207016, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 
'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfdbd7e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'a5015cd6b34eba25e7926a08d7252fe72b8b2d5e5a6d363ca2084c0ffc35f5cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106932520, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_re
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: ': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfdc71a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'd5a4ea51b7450688e678d4b84a46686981019d1a911968c5c329c2d8939c6698'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 531572580, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfdd0a2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '21303efe405b8550227bf05fafb4c445235b4787de05722d89f9b612d7d0f009'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43019849, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 
'65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfdda48-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': 'b1e7ef190485b7df2d4d1c5c3731ed3703bbabf2163844d5822403fd95259260'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492062166, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfde60a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '83e21d328b48aee0d89a7c128b433d865a6e0587373a014bf8ac8873c335a2af'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45222605, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfdef9c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '78874a4226e4693d01103740ebddf7e65d1b6adf6ebffa0c62946d8083c3d1dd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 369496916, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfdf906-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': 'aec619eef683f2bba0767e6413044ccc512e1e67c7c9a5d98aeaaaa9d75570d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3545792, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.649082', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-8
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfe02a2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '8c59cea11b031aefa662c3d29a19842f28e0de2cbb6e22a73d210a5aef841d58'}]}, 'timestamp': '2026-01-30 09:33:55.652216', '_unique_id': 'd288ec52c83b460885ae8f6e6e808c4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.653 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.654 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.654 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.654 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.655 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.655 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.655 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6000357f-ace5-447d-8b45-61624f3c3f8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.653901', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cbfe504a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': 'e41b64844c89e0d99e3192ae093d607ef09cc28efd3b0b9b5c6f3bbbb7baf98f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.653901', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cbfe5bc6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': '9f206763b8b1445799fe6c13dad94a2c772cfae70d81a4703f24fe24d83168a2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.653901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cbfe663e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '9116144bc367f7125ddf5328b14760522a50dcea2034c1743b0cc5179b80b7d8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.653901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cbfe711a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'ee01fc3bcb44a1789113e5a587adfbdc8597b821a9c16560c153cdbddfb5097f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.653901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cbfe7c64-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': '259f0b4905ec8d7404db8a82c78ebffbfbfcb269bafbf57b19bd48f37b294319'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: splay_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cbfe868c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': '0719c14665295e094283a1526b2328a9494e2690ce08dadc392a84a9b030cb8f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.653901', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 
1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cbfe9028-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '375d78c6e34b56e3071b5c3add4a0e55587443d267420ee15cbb6e10cdbb7841'}]}, 'timestamp': '2026-01-30 09:33:55.655820', '_unique_id': 'bb565e78c96041e9a308de4de3fbfbac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.657 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.657 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.657 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.write.bytes volume: 73162752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.658 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.658 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.write.bytes volume: 73379840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.658 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.658 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.write.bytes volume: 72945664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.659 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.659 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.write.bytes volume: 72990720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.659 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.659 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.660 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e58ee96-f6f9-4165-a352-6c82326f172f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfeda60-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '0516625c9cf8be48d2bf6ee1be277ea8244c8af4458832cba275949c8f44d0d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 
'6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfee41a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': 'e9ab72b40ef6307a72739b45e25d4d80a153bca282e91d1a09fdb5244a339925'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73162752, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbfeedc0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '28c71ad8319354c027860f96f415f4b8de66275fa0cf23c8c2adc237f01501bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfef8c4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '9250caa40b875a60fc189eced12edd78ebe11882b1d5f62c425e1fe6bdc49d32'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73379840, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 
'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbff0210-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': '53a5779440bae2a8258046cca35cb861be5965875f063172567569c9789e6013'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: k_name': 'sda'}, 'message_id': 'cbff0b3e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': '70a98d22444a46ae1396403ec12e63bd52bcca0c985dfcb763abb149a1895b30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72945664, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbff15b6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': 'f2eedf826097806baff6f72482f208378d4227f7c9c364ba207db4605fd08f52'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.657446', 
'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbff1fde-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '6e30079438eff36de702914ca204923745d9b770db28c698b339376a259131a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72990720, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbff2934-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': 'beaf695b07eef9980d8eca01b421159323246433f5b97f07bdef757a06095e33'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbff324e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '27dd7dae5917ac403937c6013e5ca61769954558812a0857fae0e190b20558c5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': 
'396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cbff3b90-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': 'b279555f0db2405c1c6f1bb8391f8460e8cab61f782d0d6b563daad65d0d4f9b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.657446', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_u
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbff4572-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '0515fefb4dac19f5cd18e716e821132762301a73e3b275161961c513b0cb8bd0'}]}, 'timestamp': '2026-01-30 09:33:55.660450', '_unique_id': '24383cd572894ef8b22c915e4683a3a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.662 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.662 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.662 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.662 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.662 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.663 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.663 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.663 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a400febd-978c-447c-8162-aeeefdc9a920', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.662082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cbff9090-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '608dc4ba190adabf0d3fd84196d6307049bf84476b694dfbfeeacaae1451f9c1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.662082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cbff9b8a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': 'e1b8c91f833ca519c7704d1ada104e48bf4f07ecfcd9c55e31c711da421488c5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.662082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cbffa620-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '0ed26a5943944d7f410a80b67e8fb68fd52e62e72a645a86c5a0261ba040f5eb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.662082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cbffb16a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '91e75f1f6bdc04b2685d53b4ad2d7effb64395076b6ad77e6cd2d7333cb4230d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.662082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cbffbcf0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': 'd215c8f30cc2cc7658c8af9d0d451befc3309ec0f4f4bacfebee0b28ac0ecaac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': 
'396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e035c5-tap8472693d-cc', 'timestamp': '2026-01-30T09:33:55.662082', '
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 69-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cbffc736-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': 'e75bceb4573f5feaa04992489910b53ab686583633b33fd9e3cc13d26095fb90'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.662082', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cbffd17c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '3eb782dc6d6ecedbd1c98cca1d38f5ac92eb054d175a292caac58dd721b78f6e'}]}, 'timestamp': '2026-01-30 09:33:55.664057', '_unique_id': 'bcf72767c5aa44049c127f47c899bc3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.665 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.665 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.666 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.666 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.666 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.666 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.667 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.667 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.667 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79dcadf3-2183-4cde-a713-5f5e3e775416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.666005', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cc00292e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '95950fd03be582eb78648d300ccbb0d110d996bf1398e49361f1fca353f1ee76'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.666005', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cc0034dc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': 'aa6a19c1a5b466a74658329dc3fd9d1a794e80a1912ba78bc98033a6ab097b8d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.666005', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cc003f9a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '31beabc26222be224d508d95e886ecdfc67977c2d8a223bef56e9fb15ac66e5f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.666005', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cc004a08-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '50234a295cf35db4127948f008510e4fcd95d162f176a825aa8482bbc06111e5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.666005', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cc0056c4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': '5e1a4fd4d79d50f4367e4496e280ccd370cba13dd7a78c16247ec471d03aecf2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e0
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cc00616e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': 'e2b9790cec02e07f5046f157646e7f100e8bafcc42e8e277d0ddb1a59c5272ea'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.666005', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 
'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cc006b6e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '0cbe555ceb8541cfb24da6d1dc41c34ac99a6731705f0d1b76adfddbec69d4c6'}]}, 'timestamp': '2026-01-30 09:33:55.667984', '_unique_id': '0c78f600487f4ffab37c2f0ad305895a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.679 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.680 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.688 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.688 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.699 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.699 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.707 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.475 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.707 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.714 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.715 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.478 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.644 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: _gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.735 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.736 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dbc14ca-14ce-4832-981a-af6bb594e4ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0247fe-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.064521852, 'message_signature': '3e86f4e79370a6fc856ac3c27a80c3477e110f3b8eaeeb5b6ac0f5e85ad719fc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 
'6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc025578-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.064521852, 'message_signature': '8c268be061810beff5797c44c2feb7c5d3be94ddab6621013e4012e62d5857d7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc039140-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.075412787, 'message_signature': '6d7abec0177e25410666a826fdca265d108cccd58185621cf3e9c4e93520b704'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc039c76-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.075412787, 'message_signature': '49bfd09d4b07ac5fc3e40fa669de390e4aed9f0bf44ef8338096b02134f1bf9b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 
'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc054904-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature': '5d995d4c7cfe951b60c6f672f037ebb325a75376fbc7850b5d48a3c3e2e97407'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpu
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: ssage_id': 'cc055246-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature': 'ae71c41f8ffbdbb56b76aa21c831c7c69cb7a08fc3d50fd9235ffed442094b47'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0673a6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.094940121, 'message_signature': '10e82f83113147e012bf97991b8debaf89b53d25a33a83fba65f040f728b9af3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 
'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc067c48-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.094940121, 'message_signature': '0bbaf7b996a9a681dd92478bd5fa6cdce839d930a491b8fc9f1570c75f2b48cd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc07a69a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.102565681, 'message_signature': '92425d085cce968ad936d47be52309fe4b94bfa3707a82153db2335debaa109d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc07b3a6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.102565681, 'message_signature': 'cdcb3a009c3e17c6c9667385623e9c321115d7c5acfc383d2c637bb68ad27a42'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 
'12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0ad05e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.110597373, 'message_signature': '76a9967eedbd1fa51f689d505e6b74d01da7b7c2007952c059968f917ccaee66'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.669609', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: k_name': 'sda'}, 'message_id': 'cc0ade5a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.110597373, 'message_signature': '127b3d36d28c46fd354efcf36f449ae876bb46ba81f0e02d5432c7a76b83b244'}]}, 'timestamp': '2026-01-30 09:33:55.736492', '_unique_id': 'b2cdf10efaf2485585de341a1bc32051'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.738 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.738 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.739 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.739 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.739 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.739 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.740 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.740 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.740 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.741 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.648 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.741 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.741 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.741 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '482cee80-2bbd-4af1-93a4-383dbe398bdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0b457a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.064521852, 'message_signature': '9eb25f7444d6173a3dc69ba7159ca4c7def1f0883ad1deb2b9bb9f19755e4c56'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 
'6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc0b52d6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.064521852, 'message_signature': '0b3c76d43c26780438ddd7234cf499fa9b5d90c30a31ac346816e4613065a569'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 
'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0b5d62-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.075412787, 'message_signature': '6082162ec1049f8034815bdafbbb167f11c9e25865c734cd3bb5e0de0301e82f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc0b676c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.075412787, 'message_signature': '4dc5afad571f2b3f64f2cbddfb0f5e34607428e0c5926cbfdf3426665754b262'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 
'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0b714e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature': 'ecb7cd4c9faf5aa7b2b3abfd5f8ba8cbe96c5c17f8a3ccf6f2399d62567253f0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, '
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature': '14563cd298ae87903354cb84323c50c0060dba9f35886799d448c9dc15ce8809'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0b865c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.094940121, 'message_signature': '77b307f67b76d040106fd811703b4154991437f08de8e06fd56c1ecc9f4118bf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 
'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc0b9066-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.094940121, 'message_signature': '3ee92bd7d5fdab2b301e5fe90cdfe5501486ce8751b30bc64b31e64a2c2ac900'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0b9a7a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.102565681, 'message_signature': 'adc1b647acbcaff6caada5120fbaf8f5edbe7f83bd3385a18b531ddcb1c8b203'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc0ba560-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.102565681, 'message_signature': 'b8cb995a91745ec44c2ef073f79ae035c7d381cc4a48da348c537df6fac7e38f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 
'12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc0baf88-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.110597373, 'message_signature': '50039d468b538ead2cf53fb05fad6c7ef2402565b27e17808820f8489f50d820'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.738798', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_g
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 471-fa163eabe782', 'monotonic_time': 4167.110597373, 'message_signature': '1d08e01dbeb954877583919b6f133514cfbeb7431a5f91483a9c3a843c7e768f'}]}, 'timestamp': '2026-01-30 09:33:55.742079', '_unique_id': 'bf352fae5f864876a676067403d2a9c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.743 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.744 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.744 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.744 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.745 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.745 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.745 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0f64e8c-7a9f-4ab0-8631-b87c470462ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.743899', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cc0c0c94-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '6deacb130325cd44ff5175f3918650cc718b12062a1da5f4fb714ae830c320bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.743899', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cc0c1900-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': '4e5fb9bb0c0f2f004900f55864b0f18a8451c6e6b16e477f792bc0881c59b2a3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.743899', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cc0c2396-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'a332a19368e92cace7132b7f7232f120fed406babee912e4c3d2c2125ce9c785'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.743899', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cc0c2de6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '70294660cbb10edc736168a1ec1ab6f2bc1211b1e709e3e39d779658118754a8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.743899', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cc0c38fe-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': '697fc61a5861dc83c7fa99d6cd0aa2dee76704f0694b4df6c93a5d41edd6d73d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': 
'396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e035c5-tap8472693d-cc', 'timestamp': '2026-01-30T09:33:55.743899', '
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 69-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cc0c4394-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': '24b572b584f4324f78465f32bd348aaae41357444231c32366cd22121f1c5fb0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.743899', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cc0c4dda-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '6f66b52caf5d6cd1fc159c306ee50633254caad9656fd6cb19824ec2ea5fb680'}]}, 'timestamp': '2026-01-30 09:33:55.745884', '_unique_id': 'e79d4e67028d47bb977764192912e2e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.653 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: ': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cbfdc [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.747 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.747 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.747 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.656 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.661 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: k_name': 'sda'}, 'message_id': 'cbff0b3e-fdbe-11f0-a471-fa163eabe782', 'monotoni [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.664 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.668 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.737 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: ssage_id': 'cc055246-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 nova_compute[183130]: 2026-01-30 09:33:55.779 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.742 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature':  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.746 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.794 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/memory.usage volume: 40.44140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.808 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/memory.usage volume: 42.68359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.823 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/memory.usage volume: 46.78125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.841 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/memory.usage volume: 42.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.855 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/memory.usage volume: 42.69140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.871 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.871 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9: ceilometer.compute.pollsters.NoVolumeException
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bda21b7a-324f-4875-80fd-cee73ce80d3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44140625, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'timestamp': '2026-01-30T09:33:55.747945', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cc13c178-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.188880195, 'message_signature': '8a5db2d44160f797d5b22b9c1f03cd1651c001e38f4298066e0a0203c792f59e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.68359375, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 
'8aafaddd-1368-427e-8596-2b5871053f79', 'timestamp': '2026-01-30T09:33:55.747945', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cc15f650-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.203404985, 'message_signature': 'afa7eccd09b507e0997c955a5cd9dcd5da597588f171d95dae6e5958267dd7ae'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.78125, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'timestamp': '2026-01-30T09:33:55.747945', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cc1841d0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.218517042, 'message_signature': '294fdffca5af2186c9a19720f84f1f45ec5860b60de60e73e850ed7c48e92086'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.81640625, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'timestamp': '2026-01-30T09:33:55.747945', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cc1afd58-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.236359308, 'message_signature': 'de566359e4f8524614727811c1081769110487b17f99eb076f3716d5fae742ff'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.69140625, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 
'aed146e3-865d-4aee-a055-42ed41e035c5', 'timestamp': '2026-01-30T09:33:55.747945', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cc1d17e6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.250240579, 'message_signature': 'b0d76f56738d0520d66b57f65fa5b6c00b5c2ec2fcbc9199c3fbbd8a73e41fd9'}]}, 'timestamp': '2026-01-30 09:33:55.872041', '_unique_id': '16b4e357f4704c06afee04b613b4f755'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.872 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.874 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.874 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.874 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.incoming.packets volume: 57 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.874 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.packets volume: 515 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.875 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.875 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.incoming.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.875 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.incoming.packets volume: 85 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.876 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e3bd6bd-f980-409c-8b65-10f8f5a7d685', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.874150', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cc1fef52-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': 'fef169de84b17fab90225b98b02f9a7e7cc4e1b1530fe14610ec21edf073ba86'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 57, 'user_id': 
'7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.874150', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cc1ffc5e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': 'f261467766beee68702b7db7ba9b5078e16f545ed6513220206c5b26dd4d0875'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 515, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.874150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cc200870-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '0d1d03320a42f677b68962886282435c36f784b9690521e872d4a248dbf0ad9a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.874150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cc201568-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': '9edc4fd50e49046950888da0ebeb044011a45bb05666fcf327782f7d61ebc9a0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.874150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cc20201c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': '41d8895adeae7f258d6e85f92cc023e78adbe7de2031cfcbabb9ddd1050bc8ee'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 85, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e035c5-tap8472693d-cc', 't
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.878 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.878 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.incoming.bytes volume: 10267 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.878 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.bytes volume: 99596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.878 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.incoming.bytes volume: 2856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.879 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.incoming.bytes volume: 3326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.879 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.incoming.bytes volume: 15817 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.879 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acc8a85e-40a0-49a7-b03c-4363f1b120f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.878069', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cc2086e2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '63ae080fc88be9f2d9de7651a8b1e9197f08d66db781ee578f8e350bf4a9eed7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10267, 'user_id': 
'7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.878069', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cc209254-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': '2c49f3fb97432b44272450a1b404c14b8a002733983fc572b19c7db719486a43'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 99596, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.878069', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cc209cf4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'b4094e0eed9ed637ac023d01c755dd42cad62bdba60a1bb92d6508bacae2a68e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2856, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.878069', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cc20a848-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'f770b933dfb7697e23fa0591faf0d24cc840e98c6c50d98c13f3493c53d0f9c7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3326, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.878069', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cc20b3f6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': 'e71c00587b199fbc4157b289b277f6798981d15de0b2386148be41591cc34c56'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15817, 'user_id': 
'594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e035c5-tap8472693d-cc', 'timestamp': '2026-01-30T09:33:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.881 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.882 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.882 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.882 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.883 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.883 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.883 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fec8a0cb-2b45-4cb2-a3f8-dba9ad33ef48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-00000020-6a7e9f4f-a651-4817-a679-b45828fcf5af-tap4ebb9e31-70', 'timestamp': '2026-01-30T09:33:55.881830', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'tap4ebb9e31-70', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:a5:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ebb9e31-70'}, 'message_id': 'cc21188c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.855820001, 'message_signature': '8005b330abd94c07c4684478f3364c4a76aeb916ba8d0b94872987497fcaea61'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 'instance-0000001d-8aafaddd-1368-427e-8596-2b5871053f79-tapc0d4f325-5a', 'timestamp': '2026-01-30T09:33:55.881830', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'tapc0d4f325-5a', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:bd:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc0d4f325-5a'}, 'message_id': 'cc212598-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.858732465, 'message_signature': 'ce4823a32c68a29edf55151f41edd4d2153d15fc21608986d4f401b89b93f1e3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tapf469de0f-e3', 'timestamp': '2026-01-30T09:33:55.881830', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tapf469de0f-e3', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': 
'5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:3e:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf469de0f-e3'}, 'message_id': 'cc21304c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'e9bc92294b8c8fa902093d594c577af482baf2defe70b8718d491b683c53aef0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001b-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-tap3e8e7ac3-77', 'timestamp': '2026-01-30T09:33:55.881830', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'tap3e8e7ac3-77', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:3b:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e8e7ac3-77'}, 'message_id': 'cc213a74-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.860994611, 'message_signature': 'a1e27d929be5e154e9d9d6f5d7639c32010dc1802a8d71e1fb84b82317e6fbbf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000001e-65c66677-23b6-479a-863f-3dd277183a7d-tapa5afd5ba-13', 'timestamp': '2026-01-30T09:33:55.881830', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'tapa5afd5ba-13', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:d8:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5afd5ba-13'}, 'message_id': 'cc214596-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.863567495, 'message_signature': 'bb284f780de28bb053c6eef534e12ba76cfb28dc29574bbeb59e2dd36e104210'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-0000001c-aed146e3-865d-4aee-a055-42ed41e0
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'tap8472693d-cc', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:87:b2:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8472693d-cc'}, 'message_id': 'cc215248-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.865196032, 'message_signature': '9916eb799e4ff16d1279f5d8889f45190b5509a7eedcdd3f154b9778d0a0eeb7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'instance-00000021-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-tap6b04d832-45', 'timestamp': '2026-01-30T09:33:55.881830', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'tap6b04d832-45', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 
'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9d:42:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b04d832-45'}, 'message_id': 'cc215c66-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.866961853, 'message_signature': '76f8948a8e50d03f9baf7f96326082dd3be245fa8d8dd29472d39c82a4f49cf4'}]}, 'timestamp': '2026-01-30 09:33:55.883877', '_unique_id': '62b28eab50a345788c5b3d9ebc957f49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.885 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.885 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/cpu volume: 9810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.885 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/cpu volume: 11040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.886 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/cpu volume: 11760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.886 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/cpu volume: 10950000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.886 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/cpu volume: 11140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.887 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/cpu volume: 6050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0f94665-3784-4329-90ea-6b4b9f9e1aad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9810000000, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'timestamp': '2026-01-30T09:33:55.885622', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cc21acf2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.188880195, 'message_signature': 'f5bf18ac1747314f7efcb9e0331851e65031396db142e6dc346c0369b2ccee43'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11040000000, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 
'8aafaddd-1368-427e-8596-2b5871053f79', 'timestamp': '2026-01-30T09:33:55.885622', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cc21b9c2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.203404985, 'message_signature': 'b4499b02b57511b0f9ba2987844ebf29fc4c1a67d403c202b5c7b6dcf88c29b9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11760000000, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'timestamp': '2026-01-30T09:33:55.885622', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cc21c7a0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.218517042, 'message_signature': '23949f73d77fca0c5afb7a9e4ea67dd65f3c46d71c8c3e1130b8c4c889e62c17'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10950000000, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'timestamp': '2026-01-30T09:33:55.885622', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cc21d484-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.236359308, 'message_signature': '60cc9cb1c63374caade4533618afd682c6a980a36b9f1d171b52012e3a821bd5'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11140000000, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 
'aed146e3-865d-4aee-a055-42ed41e035c5', 'timestamp': '2026-01-30T09:33:55.885622', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cc21e0b4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.250240579, 'message_signature': '44ba91f0bfbbb038435217ee0bb614c908eb69d99fa739b71f524454e9cf0090'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6050000000, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'timestamp': '2026-01-30T09:33:55.885622', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb'
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.266422336, 'message_signature': '0b7bfa4e9c80758a31a91cb671df73d71c136a26812741b38363390774f402c8'}]}, 'timestamp': '2026-01-30 09:33:55.887610', '_unique_id': 'da5e2bb74320420e8eeb610e771d4c95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.889 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.889 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.890 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.890 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.890 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.890 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.891 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.891 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.891 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.892 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.892 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.892 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e625bcd7-89bc-49ad-b36f-d8f7e3a08f1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc224180-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.064521852, 'message_signature': '5dd912de329d60673303346dffaa82f1fee2b11ec02b93da3786cf855ced7a7f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': 
'6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc224d9c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.064521852, 'message_signature': '1d16c6f08d3dea6b0f4eae3d5d0069492b8b12f5621ea79b16572cfeb2c64033'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc2258c8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.075412787, 'message_signature': '40b11440b45a2b2570bede74577e3ee9901da2dc723c77eb163df67145125eee'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc22664c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.075412787, 'message_signature': 'd13be2e73d8721386ce1d2e7f5372329eb1e6b1028e126c052761a191b32c612'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 
'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc2270a6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature': 'b326c7b69c190a89d18affbcfe0eba20a1568b5e45c5d921bd29924f3b45064d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', '
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 'message_id': 'cc227d76-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.083775898, 'message_signature': '4710188197f048eb4444b5e6b6acb08713d5aae78e0ff9496216c44e2cf81dcf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc228c62-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.094940121, 'message_signature': 'eb34fc816d280b1fad998f97e611e0099f4e7cd12d1f50e38d45b7e0518c530e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': 
{'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc2296b2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.094940121, 'message_signature': '38daa3ce1696bc8b41739f74759c7c7010ade9c5ac2539b13ccd8b518bcaaa9b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc22a0b2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.102565681, 'message_signature': 'dd2e1edc5e1f11e491f371f39d4539fe60a79f3528ee58d3b743a7c2c0574aef'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc22ab52-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.102565681, 'message_signature': '0db42d9e410a44d5ce1ed83893ab14f1d7a5dc442285f55e5e8618dce3dfa6c1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': 
None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc22b426-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.110597373, 'message_signature': '28275a3cad61d4f9852d78f53e71da128b7b4c2ae66d74938851ebe74abf8d7b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.889444', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 1, 'disk_name': 'sda'}, 'message_id': 'cc22bbba-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.110597373, 'message_signature': 'bef0dc664fa41aee273acad507f4c187c303e0fc09b98c74d8565c697e39232d'}]}, 'timestamp': '2026-01-30 09:33:55.892856', '_unique_id': 'ca99bb4b64084cde89e41090fe460d74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.894 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.894 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.894 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.895 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.read.bytes volume: 31054336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.895 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.895 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.read.bytes volume: 29248000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.895 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.895 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.896 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.896 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.read.bytes volume: 29878784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.896 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.896 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.896 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '307aa91e-71f0-46a9-891c-078888257762', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc2308fe-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': 'bb15c9a9a525acd4e45a1428faf65e0387fffbb66b681687ca0a905b9bba5b95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 
'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc2312c2-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '161e7d966dfd1a9daf8f459fb64653404b8cb527f52386d9037f0d24b188307c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31054336, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc231eb6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': 'd791071a99f20a7695217ed3bb70499c315694b3b7c3ec643f3fc99c3f368f4b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc232622-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': 'b6269b7735a33310c05b23e0fbf36ba83f2efc7cddd3c0b3b3a571eab98c085c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29248000, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 
'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc232cf8-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'c072d8ec5851d13e508ee21f8f7604d7216e327dc3d53e1eafc13d71e54bd94f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture':
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: t_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23339c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'a72ac73a1634310436a3ce83df802bb32c9c5f24c98ed0ecafc52cb01c889a7d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc233a40-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': 'dd3ecd88550c64ba986e5ca3f0fd833a3c5c30bb708df889afc9d5e31c0c2f07'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': 
'2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc2340bc-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '7f2f5ca079afa8e133abefd530d7bd6c188077636fd211281818b647df60c921'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29878784, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 
'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc234846-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '04155561b7f92d1e3d245bfaef5e0c491f442f02683f4ebe126838db018633b3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc234ee0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '84a6abf1cc4e6f29c4379d49bcc4d031f9a819d2037d4b36218d050b15e05f00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 
'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc235584-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': 'b5c9a20920c7789bba51cb2fb17a3fcbef53206d8d2520bff862108bd49efaf3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.894566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc235c1e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': 'dc3bec5b038e2800579b125645e7ad47755f5c556759081222c2bcc230bf6b95'}]}, 'timestamp': '2026-01-30 09:33:55.896924', '_unique_id': 'f3505f40afd94fa58e220f8b56c34276'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.898 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.898 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.898 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.898 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.write.requests volume: 319 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.898 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.898 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.write.requests volume: 370 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.899 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.899 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.899 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.899 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.write.requests volume: 316 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.899 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.899 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bcf2ba5-b48e-4e88-8850-5e22e4a38eb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc2392ce-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '82382ca8a0fcbb006790d0b391cf8feb6bd1529eca02dd5632d807e8a0ea43e5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 
'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc2399e0-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '2d4e98a1ccf58a4f3ca8eb47e9e4285d12d397fa38387a5c722ce907942eefb1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 319, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc23a084-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '3e382a7febc493519b8a327526ec500b3482b3066dbd6c2c08fea8179aea97aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23a728-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': 'ec235c1f97e460cd5806035b8df1bc7c17e8c249e8537d6c6be48fb6e66a69a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 370, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 
'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc23add6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': '513d7b1ddea44c3d1c68c73a05542dc63fb66b66e87ae46eafa379a63ef31b67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_r
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: b': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23b588-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'cecca4084e2014207ecaa18faa6485a41b09c98dff81f1d558c98ac17c19aff0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc23bd12-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '4c0735d9cff22851b110644f5293cc34f0cf0be0302fcc4acb05d1a3a247972a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 
'65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23c3ac-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '2d98f418137e92746cbba4fc9a2473639e70c65342585b47d4ce0cba1d3559f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 316, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc23ca46-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '81dd6bb5996d5c831690ec2e1ee8b34c8dfac40906865c302ddd9dbba24a11e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23d0d6-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': 'bbba42c4ccf005d14620f35acba2f4eac3ad0f4562b099aa6ea1e538d4e01768'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 
'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc23d770-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': 'bdde5b967c76bd4c7b4081f93b9e0039f622d080f0c6047f2a4246e6e66d7a6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.898113', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 
'43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a1
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: e, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23de0a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '82cc7a4cabf30336e12fd67ec495b66e2636fe005c964e35b89d01f0a9545276'}]}, 'timestamp': '2026-01-30 09:33:55.900275', '_unique_id': '4190c2816965415f8ef05d801a04ae19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.901 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.901 12 DEBUG ceilometer.compute.pollsters [-] 6a7e9f4f-a651-4817-a679-b45828fcf5af/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.901 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.write.latency volume: 3746952270 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.901 12 DEBUG ceilometer.compute.pollsters [-] 8aafaddd-1368-427e-8596-2b5871053f79/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.902 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.write.latency volume: 1390121376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.902 12 DEBUG ceilometer.compute.pollsters [-] 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.902 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.write.latency volume: 1623334960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.902 12 DEBUG ceilometer.compute.pollsters [-] 65c66677-23b6-479a-863f-3dd277183a7d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.902 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.write.latency volume: 2958452088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.903 12 DEBUG ceilometer.compute.pollsters [-] aed146e3-865d-4aee-a055-42ed41e035c5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.903 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.903 12 DEBUG ceilometer.compute.pollsters [-] 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a08170ac-5c88-4c2f-a153-d61fcb617f83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-vda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc24133e-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': '338e259e7a44307df334645c42ece8740e4ae7734e766df9ff426eaae7ddd0df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 
'resource_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af-sda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-85653886', 'name': 'instance-00000020', 'instance_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6e939c4-3cd5-464f-b227-1809e53fe850'}, 'image_ref': 'a6e939c4-3cd5-464f-b227-1809e53fe850', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc241a14-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.874399878, 'message_signature': 'fc197e9f8a0e3fb4bc73cc28435c4ccac68b5c74a68ed1011a2ab09fc7ab8f4d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3746952270, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-vda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc24209a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': '28844ea4a365cdd5ece993ba7c0f4c7d0b67c101f538a4b8ab2b219e7e7d3d37'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7701defc672143599a29756b7b25b4dc', 'user_name': None, 'project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'project_name': None, 'resource_id': '8aafaddd-1368-427e-8596-2b5871053f79-sda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1086942880', 'name': 'instance-0000001d', 'instance_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'instance_type': 'm1.nano', 'host': '77e64c3a73043e18ce1eda0cf07b0b25b748b6325d1cae1c118d8ee9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc242734-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.896943219, 'message_signature': 'af8621243a4e1a3d4b598ac8ab5f99aa3bd26cbf9b87743e92d25aae074f8fd7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1390121376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': 
None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-vda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc242e78-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': '6465f8356ed67894ab13ba732881af569ad5a84b5eb20ac09107668105a6f4f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-sda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-889211547', 'name': 'instance-0000001b', 'instance_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'archit
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc24353a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.922749275, 'message_signature': 'a793266879bfb2d9c2a01a36bf37ceda2a45bfc9b23a64863f89b62c88f81077'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1623334960, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-vda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc243bd4-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': 'a78255d6b6c9b89af1e48392aa6574a9fe6cfab16aa4f36537327cdca3219fbd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '65c66677-23b6-479a-863f-3dd277183a7d-sda', 'timestamp': 
'2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-667740087', 'name': 'instance-0000001e', 'instance_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc244264-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.949416106, 'message_signature': '67612a8c8ef33d6c54d7e3bf92d6ef9c77eae8a230177f3afe44274f1c1203b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2958452088, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-vda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 
'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc2448fe-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': '527b83fc501d6682a1d63d50b3d3882e34071d162117dff6f3f803bc319086d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': 'aed146e3-865d-4aee-a055-42ed41e035c5-sda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611', 'name': 'instance-0000001c', 'instance_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc24509c-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4166.976463347, 'message_signature': 'f60c30e542c0cdd5ab2eeb8476e8d089b22cc14cfe2ac29981ecdca182de7612'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 
'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-vda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cc245a42-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '3c2483ae27387d7ff60228f0e76d57d9470fbf6548ffbe72590b02f1079bc1c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '594b0fd4bee7455ab5aac7774bd07b70', 'user_name': None, 'project_id': '396e2944b44f42e59b102db87e2e060c', 'project_name': None, 'resource_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-sda', 'timestamp': '2026-01-30T09:33:55.901421', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652', 'name': 'instance-00000021', 'instance_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'instance_type': 'm1.nano', 'host': '02a48879787e91c16c3384bdc8a260e648ef90ba51e764982fd11a77', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: ', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc24614a-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.006168616, 'message_signature': '0f8e4e85aef3100d3efc9f9f6aa40689c3904bae206d8fe0a0c391d4e9b076cf'}]}, 'timestamp': '2026-01-30 09:33:55.903611', '_unique_id': 'ccdcd83574034361932eb20d90e8dda9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.904 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:33:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:33:55.905 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-85653886>, <NovaLikeServer: tempest-TestSnapshotPattern-server-1086942880>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-889211547>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-667740087>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652>]
Jan 30 04:33:55 np0005601977 nova_compute[183130]: 2026-01-30 09:33:55.913 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (10923) with configured size 8096, begin of message is: 2026-01-30 09:33:55.877 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (10888) with configured size 8096, begin of message is: 2026-01-30 09:33:55.880 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.884 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.888 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.893 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is:  'message_id': 'cc227d76-fdbe-11f0-a471-fa163eabe782', 'monotonic_time': 4167.08 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.897 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: t_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23339c-fdbe-11f0-a471-fa163eabe7 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.900 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: b': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc23 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is: 2026-01-30 09:33:55.904 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:55 np0005601977 rsyslogd[1006]: message too long (8192) with configured size 8096, begin of message is:  0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cc24353a-fdbe-11f0-a471-fa [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.014 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.part --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.015 183134 DEBUG nova.virt.images [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] 2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.016 183134 DEBUG nova.privsep.utils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.017 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.part /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.164 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.part /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.converted" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.167 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.188 183134 DEBUG nova.compute.manager [req-1b584f46-355f-4cbc-9b42-433538830bf4 req-5ab22ab8-5fda-468b-b8e8-27cbfd29c24d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.188 183134 DEBUG oslo_concurrency.lockutils [req-1b584f46-355f-4cbc-9b42-433538830bf4 req-5ab22ab8-5fda-468b-b8e8-27cbfd29c24d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.189 183134 DEBUG oslo_concurrency.lockutils [req-1b584f46-355f-4cbc-9b42-433538830bf4 req-5ab22ab8-5fda-468b-b8e8-27cbfd29c24d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.189 183134 DEBUG oslo_concurrency.lockutils [req-1b584f46-355f-4cbc-9b42-433538830bf4 req-5ab22ab8-5fda-468b-b8e8-27cbfd29c24d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.189 183134 DEBUG nova.compute.manager [req-1b584f46-355f-4cbc-9b42-433538830bf4 req-5ab22ab8-5fda-468b-b8e8-27cbfd29c24d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.190 183134 WARNING nova.compute.manager [req-1b584f46-355f-4cbc-9b42-433538830bf4 req-5ab22ab8-5fda-468b-b8e8-27cbfd29c24d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received unexpected event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.233 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec.converted --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.234 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e5be927f4d5d3cf8a551fcd7e66a81d6274021ec" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.253 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.326 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.328 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e5be927f4d5d3cf8a551fcd7e66a81d6274021ec" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.329 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e5be927f4d5d3cf8a551fcd7e66a81d6274021ec" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.349 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.410 183134 DEBUG nova.network.neutron [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Updated VIF entry in instance network info cache for port 6b04d832-453a-4046-a311-7f401c10412f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.411 183134 DEBUG nova.network.neutron [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Updating instance_info_cache with network_info: [{"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.424 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.425 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec,backing_fmt=raw /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.457 183134 DEBUG oslo_concurrency.lockutils [req-e9820f24-7fb6-45d6-a8b9-27c4a0cc0ed3 req-4c3af74e-dc8f-4d75-857f-1f9ad8ab0544 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.471 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec,backing_fmt=raw /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.472 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e5be927f4d5d3cf8a551fcd7e66a81d6274021ec" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.472 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.522 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.523 183134 DEBUG nova.virt.disk.api [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.523 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.567 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.568 183134 DEBUG nova.virt.disk.api [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.569 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.569 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Ensure instance console log exists: /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.570 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.570 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.571 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.574 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Start _get_guest_xml network_info=[{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:22:00Z,direct_url=<?>,disk_format='qcow2',id=2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.578 183134 WARNING nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.586 183134 DEBUG nova.virt.libvirt.host [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.587 183134 DEBUG nova.virt.libvirt.host [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.589 183134 DEBUG nova.virt.libvirt.host [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.590 183134 DEBUG nova.virt.libvirt.host [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.591 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.591 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:22:00Z,direct_url=<?>,disk_format='qcow2',id=2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.592 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.592 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.592 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.592 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.592 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.593 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.593 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.593 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.593 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.593 183134 DEBUG nova.virt.hardware [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.594 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.612 183134 DEBUG nova.virt.libvirt.vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-207746804',display_name='tempest-TestNetworkAdvancedServerOps-server-207746804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-207746804',id=31,image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHN1UvdVdxjRqWodp0BMNbZ+GuyrulD0WI5KcbScYGQgPQB4wl/ZaktEG5xr0Om9ojhk6Hzy9SxfALCy1xa8KSr75yE8ZE1A0eo/1WyunUzyt9Blwa2sI8tAidj85d5Hw==',key_name='tempest-TestNetworkAdvancedServerOps-1735414248',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-mw7kacnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:54Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=0e693c72-183a-4005-8891-207b95ad22b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.612 183134 DEBUG nova.network.os_vif_util [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.613 183134 DEBUG nova.network.os_vif_util [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.614 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <uuid>0e693c72-183a-4005-8891-207b95ad22b1</uuid>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <name>instance-0000001f</name>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-207746804</nova:name>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:33:56</nova:creationTime>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        <nova:port uuid="747cab40-fbad-4008-a7ac-6cf1f12b6ee4">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <entry name="serial">0e693c72-183a-4005-8891-207b95ad22b1</entry>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <entry name="uuid">0e693c72-183a-4005-8891-207b95ad22b1</entry>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:99:de:f3"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <target dev="tap747cab40-fb"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/console.log" append="off"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:33:56 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:33:56 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:33:56 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:33:56 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.615 183134 DEBUG nova.virt.libvirt.vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-207746804',display_name='tempest-TestNetworkAdvancedServerOps-server-207746804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-207746804',id=31,image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHN1UvdVdxjRqWodp0BMNbZ+GuyrulD0WI5KcbScYGQgPQB4wl/ZaktEG5xr0Om9ojhk6Hzy9SxfALCy1xa8KSr75yE8ZE1A0eo/1WyunUzyt9Blwa2sI8tAidj85d5Hw==',key_name='tempest-TestNetworkAdvancedServerOps-1735414248',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-mw7kacnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:33:54Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=0e693c72-183a-4005-8891-207b95ad22b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.615 183134 DEBUG nova.network.os_vif_util [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.616 183134 DEBUG nova.network.os_vif_util [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.616 183134 DEBUG os_vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.617 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.617 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.617 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.620 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.620 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap747cab40-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.621 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap747cab40-fb, col_values=(('external_ids', {'iface-id': '747cab40-fbad-4008-a7ac-6cf1f12b6ee4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:de:f3', 'vm-uuid': '0e693c72-183a-4005-8891-207b95ad22b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:56 np0005601977 NetworkManager[55565]: <info>  [1769765636.6236] manager: (tap747cab40-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.627 183134 INFO os_vif [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb')#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.670 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.671 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.671 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:99:de:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.671 183134 INFO nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Using config drive#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.688 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:56 np0005601977 nova_compute[183130]: 2026-01-30 09:33:56.723 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'keypairs' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.131 183134 INFO nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Creating config drive at /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config#033[00m
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.136 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcd2f2tyn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.256 183134 DEBUG oslo_concurrency.processutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcd2f2tyn" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:33:57 np0005601977 kernel: tap747cab40-fb: entered promiscuous mode
Jan 30 04:33:57 np0005601977 NetworkManager[55565]: <info>  [1769765637.3203] manager: (tap747cab40-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Jan 30 04:33:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:57Z|00300|binding|INFO|Claiming lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for this chassis.
Jan 30 04:33:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:57Z|00301|binding|INFO|747cab40-fbad-4008-a7ac-6cf1f12b6ee4: Claiming fa:16:3e:99:de:f3 10.100.0.12
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.326 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:57Z|00302|binding|INFO|Setting lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 ovn-installed in OVS
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.338 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.338 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:de:f3 10.100.0.12'], port_security=['fa:16:3e:99:de:f3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0e693c72-183a-4005-8891-207b95ad22b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '90463a7d-c3a0-4624-975d-0cc4b6ff9814', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.244'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a98fd02a-19ea-434b-9ec2-1fdf64f82e5f, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=747cab40-fbad-4008-a7ac-6cf1f12b6ee4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:57Z|00303|binding|INFO|Setting lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 up in Southbound
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.340 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 in datapath b2ca1571-8ba0-4f98-bb63-cbd6ba450393 bound to our chassis#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.342 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2ca1571-8ba0-4f98-bb63-cbd6ba450393#033[00m
Jan 30 04:33:57 np0005601977 systemd-udevd[220229]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:57 np0005601977 NetworkManager[55565]: <info>  [1769765637.3549] device (tap747cab40-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:33:57 np0005601977 NetworkManager[55565]: <info>  [1769765637.3555] device (tap747cab40-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.355 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e4e4ac-7859-4917-aa0b-c28d5835d3d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.356 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2ca1571-81 in ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:33:57 np0005601977 systemd-machined[154431]: New machine qemu-27-instance-0000001f.
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.358 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2ca1571-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.358 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ef86a6-f1dd-4b65-91ca-9665073c1e20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.359 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a57298d3-8bea-430d-878e-16d3e98c6f01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 systemd[1]: Started Virtual Machine qemu-27-instance-0000001f.
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.368 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[69babafa-680e-4082-824c-8e221eeba23f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.385 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.385 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.386 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:57 np0005601977 podman[220208]: 2026-01-30 09:33:57.387915213 +0000 UTC m=+0.072315911 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.391 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[14954023-37d3-4dc7-9b43-934bd81290a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.409 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c153af85-dc41-4eec-8f10-8bc49a754cd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 NetworkManager[55565]: <info>  [1769765637.4141] manager: (tapb2ca1571-80): new Veth device (/org/freedesktop/NetworkManager/Devices/130)
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.412 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fd033492-86c1-41ac-be79-36637e708668]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 systemd-udevd[220233]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.433 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[09bce6b3-d413-40e2-8a9f-de899457fadf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 podman[220207]: 2026-01-30 09:33:57.433941373 +0000 UTC m=+0.118659580 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.436 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[286edca0-c81e-454b-a4b8-df01512d1d6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 NetworkManager[55565]: <info>  [1769765637.4498] device (tapb2ca1571-80): carrier: link connected
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.453 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d8c06f-690b-48a2-af54-903787d11475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.466 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b18b13af-34f3-42f3-aa3d-be68bd4f65c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2ca1571-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:1f:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416879, 'reachable_time': 16076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220288, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.479 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[142a5cb6-9fb7-4b07-a5b0-5947b35bf3c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee1:1f08'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416879, 'tstamp': 416879}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220289, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.493 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[509c9cd4-55aa-462c-971e-ca0c9c3c32e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2ca1571-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:1f:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416879, 'reachable_time': 16076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220290, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.514 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fea43e7d-2723-4608-ba7d-ced19847e588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.550 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bab39542-8e1d-491d-9a18-06d349d5eb24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.552 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2ca1571-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.552 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.552 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2ca1571-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:57 np0005601977 kernel: tapb2ca1571-80: entered promiscuous mode
Jan 30 04:33:57 np0005601977 NetworkManager[55565]: <info>  [1769765637.5551] manager: (tapb2ca1571-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.555 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.557 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.557 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2ca1571-80, col_values=(('external_ids', {'iface-id': '92996e6c-be8d-4868-a92b-0dd619c09c89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:57Z|00304|binding|INFO|Releasing lport 92996e6c-be8d-4868-a92b-0dd619c09c89 from this chassis (sb_readonly=0)
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.558 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:57 np0005601977 nova_compute[183130]: 2026-01-30 09:33:57.563 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.563 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.564 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c037e777-dc18-4c4a-ab49-a133240b36fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.565 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b2ca1571-8ba0-4f98-bb63-cbd6ba450393
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.pid.haproxy
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b2ca1571-8ba0-4f98-bb63-cbd6ba450393
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:33:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:33:57.566 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'env', 'PROCESS_TAG=haproxy-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2ca1571-8ba0-4f98-bb63-cbd6ba450393.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:33:57 np0005601977 podman[220323]: 2026-01-30 09:33:57.868894822 +0000 UTC m=+0.041898921 container create 8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:33:57 np0005601977 systemd[1]: Started libpod-conmon-8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628.scope.
Jan 30 04:33:57 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:33:57 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac46e4207006103a28ccdcdc19d9524c9dabf760bc0ed83292390ac0698dfb56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:33:57 np0005601977 podman[220323]: 2026-01-30 09:33:57.934889949 +0000 UTC m=+0.107894068 container init 8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:33:57 np0005601977 podman[220323]: 2026-01-30 09:33:57.941603953 +0000 UTC m=+0.114608052 container start 8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:33:57 np0005601977 podman[220323]: 2026-01-30 09:33:57.846487515 +0000 UTC m=+0.019491644 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:33:57 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [NOTICE]   (220343) : New worker (220345) forked
Jan 30 04:33:57 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [NOTICE]   (220343) : Loading success.
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.293 183134 DEBUG nova.compute.manager [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.294 183134 DEBUG oslo_concurrency.lockutils [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.294 183134 DEBUG oslo_concurrency.lockutils [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.295 183134 DEBUG oslo_concurrency.lockutils [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.295 183134 DEBUG nova.compute.manager [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.296 183134 WARNING nova.compute.manager [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received unexpected event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.296 183134 DEBUG nova.compute.manager [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.297 183134 DEBUG oslo_concurrency.lockutils [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.297 183134 DEBUG oslo_concurrency.lockutils [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.297 183134 DEBUG oslo_concurrency.lockutils [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.298 183134 DEBUG nova.compute.manager [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.298 183134 WARNING nova.compute.manager [req-cfa19e54-1c99-4efe-9bdd-6271e045fd07 req-44239fa3-db3e-4537-a71c-55ff331780f7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received unexpected event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.361 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Removed pending event for 0e693c72-183a-4005-8891-207b95ad22b1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.361 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765638.3608446, 0e693c72-183a-4005-8891-207b95ad22b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.362 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.364 183134 DEBUG nova.compute.manager [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.364 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.366 183134 INFO nova.virt.libvirt.driver [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance spawned successfully.#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.367 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.393 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.397 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.403 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.403 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.404 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.404 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.404 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.405 183134 DEBUG nova.virt.libvirt.driver [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.432 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.433 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765638.3617938, 0e693c72-183a-4005-8891-207b95ad22b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.433 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] VM Started (Lifecycle Event)#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.456 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.459 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.471 183134 DEBUG nova.compute.manager [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.481 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.527 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.527 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.527 183134 DEBUG nova.objects.instance [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 30 04:33:58 np0005601977 nova_compute[183130]: 2026-01-30 09:33:58.600 183134 DEBUG oslo_concurrency.lockutils [None req-de7ce96e-efbe-4c1b-8c76-ce876b847058 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:58Z|00048|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.14 does not match offer 10.100.0.8
Jan 30 04:33:58 np0005601977 ovn_controller[95460]: 2026-01-30T09:33:58Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:52:a5:03 10.100.0.8
Jan 30 04:34:00 np0005601977 nova_compute[183130]: 2026-01-30 09:34:00.782 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:01Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9d:42:f4 10.100.0.8
Jan 30 04:34:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:01Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:42:f4 10.100.0.8
Jan 30 04:34:01 np0005601977 nova_compute[183130]: 2026-01-30 09:34:01.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:01 np0005601977 podman[220370]: 2026-01-30 09:34:01.850247032 +0000 UTC m=+0.068269474 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:34:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:02Z|00052|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.14 does not match offer 10.100.0.8
Jan 30 04:34:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:02Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:52:a5:03 10.100.0.8
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.367 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.451 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.496 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.497 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.541 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.546 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.598 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.599 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.640 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.647 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.690 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.691 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.733 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.739 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.782 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.783 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.840 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.846 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.890 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.891 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.933 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.939 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.984 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:02 np0005601977 nova_compute[183130]: 2026-01-30 09:34:02.985 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.028 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.036 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.082 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.084 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.129 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.343 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.345 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4615MB free_disk=73.10082244873047GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.345 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.346 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.443 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.444 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance aed146e3-865d-4aee-a055-42ed41e035c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.444 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 8aafaddd-1368-427e-8596-2b5871053f79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.444 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 65c66677-23b6-479a-863f-3dd277183a7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.444 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 0e693c72-183a-4005-8891-207b95ad22b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.445 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 6a7e9f4f-a651-4817-a679-b45828fcf5af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.445 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.445 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.445 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.606 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.671 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.710 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:34:03 np0005601977 nova_compute[183130]: 2026-01-30 09:34:03.711 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:03Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:a5:03 10.100.0.8
Jan 30 04:34:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:03Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:a5:03 10.100.0.8
Jan 30 04:34:05 np0005601977 nova_compute[183130]: 2026-01-30 09:34:05.786 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:06 np0005601977 nova_compute[183130]: 2026-01-30 09:34:06.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.158 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.159 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.160 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.160 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.160 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.162 183134 INFO nova.compute.manager [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Terminating instance#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.162 183134 DEBUG nova.compute.manager [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:34:07 np0005601977 kernel: tap6b04d832-45 (unregistering): left promiscuous mode
Jan 30 04:34:07 np0005601977 NetworkManager[55565]: <info>  [1769765647.1903] device (tap6b04d832-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.199 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:07Z|00305|binding|INFO|Releasing lport 6b04d832-453a-4046-a311-7f401c10412f from this chassis (sb_readonly=0)
Jan 30 04:34:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:07Z|00306|binding|INFO|Setting lport 6b04d832-453a-4046-a311-7f401c10412f down in Southbound
Jan 30 04:34:07 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:07Z|00307|binding|INFO|Removing iface tap6b04d832-45 ovn-installed in OVS
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.203 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.208 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:42:f4 10.100.0.8'], port_security=['fa:16:3e:9d:42:f4 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '12406b2c-7c9c-41b8-b0c7-30bf4455b4a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-632dc37f-a471-48f7-998e-601c234d5eea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '46e120fc-27ce-4640-b22b-ca03372cbb62', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa3c73d5-3fb7-4892-bbfe-678dc6ae4603, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=6b04d832-453a-4046-a311-7f401c10412f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.210 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 6b04d832-453a-4046-a311-7f401c10412f in datapath 632dc37f-a471-48f7-998e-601c234d5eea unbound from our chassis#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.214 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.213 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 632dc37f-a471-48f7-998e-601c234d5eea#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.223 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[adfb00a6-3d07-419a-8718-752fda501388]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:07 np0005601977 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 30 04:34:07 np0005601977 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000021.scope: Consumed 12.567s CPU time.
Jan 30 04:34:07 np0005601977 systemd-machined[154431]: Machine qemu-26-instance-00000021 terminated.
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.253 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[fc70996d-5403-492f-963c-80fdaba8888a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.256 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f8da21c6-6956-4dde-9a09-4be833f5fbff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.278 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[1f464f93-cb9a-417d-b322-e753d70f38a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.292 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1dac7576-3539-4e2b-bd24-048a9e8f267e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap632dc37f-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:e5:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411455, 'reachable_time': 25559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220459, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.303 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a25450a7-7591-4c49-9a2f-5fa5d6471acd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap632dc37f-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411464, 'tstamp': 411464}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220460, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap632dc37f-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411466, 'tstamp': 411466}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220460, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.305 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap632dc37f-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.306 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.310 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap632dc37f-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.310 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.310 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.310 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap632dc37f-a0, col_values=(('external_ids', {'iface-id': '13570b6a-d879-43dc-b830-8118569a82b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:07.310 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.413 183134 INFO nova.virt.libvirt.driver [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Instance destroyed successfully.#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.414 183134 DEBUG nova.objects.instance [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.432 183134 DEBUG nova.virt.libvirt.vif [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:33:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-gen-1-113243652',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ge',id=33,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHWc7+iju0ig/cYYtgMF7FccVXU/cVOvBYUArFUjtOyOMzdPSWipY4qhftKwG2kAT2FaeRfqftE1sruqmFqkCPVFpP923bzNJR9Cde3eohExOkgLh5N+aVAVzBeqt1QUXA==',key_name='tempest-TestSecurityGroupsBasicOps-1925360427',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-j5kq0qm1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:33:49Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=12406b2c-7c9c-41b8-b0c7-30bf4455b4a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.432 183134 DEBUG nova.network.os_vif_util [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "6b04d832-453a-4046-a311-7f401c10412f", "address": "fa:16:3e:9d:42:f4", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b04d832-45", "ovs_interfaceid": "6b04d832-453a-4046-a311-7f401c10412f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.433 183134 DEBUG nova.network.os_vif_util [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.434 183134 DEBUG os_vif [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.435 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.436 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b04d832-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.439 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.441 183134 INFO os_vif [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:42:f4,bridge_name='br-int',has_traffic_filtering=True,id=6b04d832-453a-4046-a311-7f401c10412f,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b04d832-45')#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.442 183134 INFO nova.virt.libvirt.driver [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Deleting instance files /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9_del#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.443 183134 INFO nova.virt.libvirt.driver [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Deletion of /var/lib/nova/instances/12406b2c-7c9c-41b8-b0c7-30bf4455b4a9_del complete#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.502 183134 INFO nova.compute.manager [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.502 183134 DEBUG oslo.service.loopingcall [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.503 183134 DEBUG nova.compute.manager [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.503 183134 DEBUG nova.network.neutron [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.636 183134 DEBUG nova.compute.manager [req-63169f08-1ce7-475a-b574-6bf114026fcf req-1f7d1f16-0d61-41f5-83f8-26d961e8de5c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-vif-unplugged-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.636 183134 DEBUG oslo_concurrency.lockutils [req-63169f08-1ce7-475a-b574-6bf114026fcf req-1f7d1f16-0d61-41f5-83f8-26d961e8de5c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.636 183134 DEBUG oslo_concurrency.lockutils [req-63169f08-1ce7-475a-b574-6bf114026fcf req-1f7d1f16-0d61-41f5-83f8-26d961e8de5c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.637 183134 DEBUG oslo_concurrency.lockutils [req-63169f08-1ce7-475a-b574-6bf114026fcf req-1f7d1f16-0d61-41f5-83f8-26d961e8de5c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.637 183134 DEBUG nova.compute.manager [req-63169f08-1ce7-475a-b574-6bf114026fcf req-1f7d1f16-0d61-41f5-83f8-26d961e8de5c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] No waiting events found dispatching network-vif-unplugged-6b04d832-453a-4046-a311-7f401c10412f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:07 np0005601977 nova_compute[183130]: 2026-01-30 09:34:07.637 183134 DEBUG nova.compute.manager [req-63169f08-1ce7-475a-b574-6bf114026fcf req-1f7d1f16-0d61-41f5-83f8-26d961e8de5c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-vif-unplugged-6b04d832-453a-4046-a311-7f401c10412f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.374 183134 DEBUG nova.network.neutron [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.412 183134 INFO nova.compute.manager [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Took 0.91 seconds to deallocate network for instance.#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.462 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.463 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.469 183134 DEBUG nova.compute.manager [req-6881a4b3-0c7e-4a76-a69c-00047ed0d979 req-c617ab2d-d27d-497c-a35c-3dad8fa7d602 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-vif-deleted-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.653 183134 DEBUG nova.compute.provider_tree [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.669 183134 DEBUG nova.scheduler.client.report [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.689 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.711 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.716 183134 INFO nova.scheduler.client.report [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9#033[00m
Jan 30 04:34:08 np0005601977 nova_compute[183130]: 2026-01-30 09:34:08.781 183134 DEBUG oslo_concurrency.lockutils [None req-0b2ccb0a-dd7f-4152-b993-2820b5ca81a9 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.738 183134 DEBUG nova.compute.manager [req-afd42227-12a9-47a4-9274-77d879405fc0 req-3e04bf0a-a0ad-4fbf-a319-cd4c6b59b7ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.739 183134 DEBUG oslo_concurrency.lockutils [req-afd42227-12a9-47a4-9274-77d879405fc0 req-3e04bf0a-a0ad-4fbf-a319-cd4c6b59b7ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.739 183134 DEBUG oslo_concurrency.lockutils [req-afd42227-12a9-47a4-9274-77d879405fc0 req-3e04bf0a-a0ad-4fbf-a319-cd4c6b59b7ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.739 183134 DEBUG oslo_concurrency.lockutils [req-afd42227-12a9-47a4-9274-77d879405fc0 req-3e04bf0a-a0ad-4fbf-a319-cd4c6b59b7ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "12406b2c-7c9c-41b8-b0c7-30bf4455b4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.739 183134 DEBUG nova.compute.manager [req-afd42227-12a9-47a4-9274-77d879405fc0 req-3e04bf0a-a0ad-4fbf-a319-cd4c6b59b7ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] No waiting events found dispatching network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:09 np0005601977 nova_compute[183130]: 2026-01-30 09:34:09.740 183134 WARNING nova.compute.manager [req-afd42227-12a9-47a4-9274-77d879405fc0 req-3e04bf0a-a0ad-4fbf-a319-cd4c6b59b7ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Received unexpected event network-vif-plugged-6b04d832-453a-4046-a311-7f401c10412f for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:34:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:09Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:de:f3 10.100.0.12
Jan 30 04:34:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:09Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:de:f3 10.100.0.12
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.026 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.026 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.027 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.027 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.027 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.028 183134 INFO nova.compute.manager [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Terminating instance#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.029 183134 DEBUG nova.compute.manager [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:34:10 np0005601977 kernel: tap8472693d-cc (unregistering): left promiscuous mode
Jan 30 04:34:10 np0005601977 NetworkManager[55565]: <info>  [1769765650.0674] device (tap8472693d-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:10Z|00308|binding|INFO|Releasing lport 8472693d-cc3d-4223-b981-b7d1e9f96531 from this chassis (sb_readonly=0)
Jan 30 04:34:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:10Z|00309|binding|INFO|Setting lport 8472693d-cc3d-4223-b981-b7d1e9f96531 down in Southbound
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.071 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:10Z|00310|binding|INFO|Removing iface tap8472693d-cc ovn-installed in OVS
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.073 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.080 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:b2:b6 10.100.0.4'], port_security=['fa:16:3e:87:b2:b6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'aed146e3-865d-4aee-a055-42ed41e035c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-632dc37f-a471-48f7-998e-601c234d5eea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '396e2944b44f42e59b102db87e2e060c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4b89a168-ee83-4ac9-852d-dbd31b3e41f9 4c85f148-14e9-414e-82f7-3cd927a329dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa3c73d5-3fb7-4892-bbfe-678dc6ae4603, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=8472693d-cc3d-4223-b981-b7d1e9f96531) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.082 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 8472693d-cc3d-4223-b981-b7d1e9f96531 in datapath 632dc37f-a471-48f7-998e-601c234d5eea unbound from our chassis#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.086 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 632dc37f-a471-48f7-998e-601c234d5eea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.090 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.090 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8f581cce-f93b-414d-80e6-314c54526917]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.091 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea namespace which is not needed anymore#033[00m
Jan 30 04:34:10 np0005601977 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 30 04:34:10 np0005601977 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Consumed 14.070s CPU time.
Jan 30 04:34:10 np0005601977 systemd-machined[154431]: Machine qemu-21-instance-0000001c terminated.
Jan 30 04:34:10 np0005601977 podman[220500]: 2026-01-30 09:34:10.162674148 +0000 UTC m=+0.073654189 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:34:10 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [NOTICE]   (219331) : haproxy version is 2.8.14-c23fe91
Jan 30 04:34:10 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [NOTICE]   (219331) : path to executable is /usr/sbin/haproxy
Jan 30 04:34:10 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [WARNING]  (219331) : Exiting Master process...
Jan 30 04:34:10 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [WARNING]  (219331) : Exiting Master process...
Jan 30 04:34:10 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [ALERT]    (219331) : Current worker (219333) exited with code 143 (Terminated)
Jan 30 04:34:10 np0005601977 neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea[219327]: [WARNING]  (219331) : All workers exited. Exiting... (0)
Jan 30 04:34:10 np0005601977 systemd[1]: libpod-e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8.scope: Deactivated successfully.
Jan 30 04:34:10 np0005601977 podman[220539]: 2026-01-30 09:34:10.209582934 +0000 UTC m=+0.040260525 container died e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:34:10 np0005601977 systemd[1]: var-lib-containers-storage-overlay-a0904f9e7d534fe9dc1ee87ea4033b77a2a99ec5a5d694dcc8d83411b849a25b-merged.mount: Deactivated successfully.
Jan 30 04:34:10 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8-userdata-shm.mount: Deactivated successfully.
Jan 30 04:34:10 np0005601977 podman[220539]: 2026-01-30 09:34:10.243552145 +0000 UTC m=+0.074229736 container cleanup e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:34:10 np0005601977 systemd[1]: libpod-conmon-e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8.scope: Deactivated successfully.
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.279 183134 INFO nova.virt.libvirt.driver [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Instance destroyed successfully.#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.280 183134 DEBUG nova.objects.instance [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lazy-loading 'resources' on Instance uuid aed146e3-865d-4aee-a055-42ed41e035c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.295 183134 DEBUG nova.virt.libvirt.vif [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2060529369-access_point-721263611',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2060529369-ac',id=28,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHWc7+iju0ig/cYYtgMF7FccVXU/cVOvBYUArFUjtOyOMzdPSWipY4qhftKwG2kAT2FaeRfqftE1sruqmFqkCPVFpP923bzNJR9Cde3eohExOkgLh5N+aVAVzBeqt1QUXA==',key_name='tempest-TestSecurityGroupsBasicOps-1925360427',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='396e2944b44f42e59b102db87e2e060c',ramdisk_id='',reservation_id='r-k77iz630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2060529369',owner_user_name='tempest-TestSecurityGroupsBasicOps-2060529369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:33:09Z,user_data=None,user_id='594b0fd4bee7455ab5aac7774bd07b70',uuid=aed146e3-865d-4aee-a055-42ed41e035c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.296 183134 DEBUG nova.network.os_vif_util [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converting VIF {"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.297 183134 DEBUG nova.network.os_vif_util [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.297 183134 DEBUG os_vif [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.298 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.299 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8472693d-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:10 np0005601977 podman[220571]: 2026-01-30 09:34:10.306211576 +0000 UTC m=+0.040364317 container remove e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.318 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.320 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.321 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[531b2f30-41c6-4e1c-b2bc-4dd2cee0c5ce]: (4, ('Fri Jan 30 09:34:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea (e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8)\ne043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8\nFri Jan 30 09:34:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea (e043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8)\ne043f69c0ae36f729ac0e6728ccd301718592fb6381332335bd300d4a20c8ce8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.322 183134 INFO os_vif [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:b2:b6,bridge_name='br-int',has_traffic_filtering=True,id=8472693d-cc3d-4223-b981-b7d1e9f96531,network=Network(632dc37f-a471-48f7-998e-601c234d5eea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8472693d-cc')#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.322 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[eacd6599-4f11-4233-94cf-66d6c1cd1e71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.322 183134 INFO nova.virt.libvirt.driver [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Deleting instance files /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5_del#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.323 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap632dc37f-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.323 183134 INFO nova.virt.libvirt.driver [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Deletion of /var/lib/nova/instances/aed146e3-865d-4aee-a055-42ed41e035c5_del complete#033[00m
Jan 30 04:34:10 np0005601977 kernel: tap632dc37f-a0: left promiscuous mode
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.327 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.328 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.330 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[86249263-a485-4e24-a6b5-1c0efb698ae4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.351 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[715382b5-586e-4784-8804-2dcaafe48b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.352 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c540cc0f-6b17-4661-8197-4d1bf8e22cf1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.362 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a11c9251-3384-4a62-b129-09394918607c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411449, 'reachable_time': 23462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220601, 'error': None, 'target': 'ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.364 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-632dc37f-a471-48f7-998e-601c234d5eea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:34:10 np0005601977 systemd[1]: run-netns-ovnmeta\x2d632dc37f\x2da471\x2d48f7\x2d998e\x2d601c234d5eea.mount: Deactivated successfully.
Jan 30 04:34:10 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:10.365 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[7c787ca9-d2a4-4cef-9692-1e7549600263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.375 183134 INFO nova.compute.manager [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.375 183134 DEBUG oslo.service.loopingcall [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.375 183134 DEBUG nova.compute.manager [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.376 183134 DEBUG nova.network.neutron [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.556 183134 DEBUG nova.compute.manager [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-changed-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.557 183134 DEBUG nova.compute.manager [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Refreshing instance network info cache due to event network-changed-8472693d-cc3d-4223-b981-b7d1e9f96531. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.558 183134 DEBUG oslo_concurrency.lockutils [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.559 183134 DEBUG oslo_concurrency.lockutils [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.559 183134 DEBUG nova.network.neutron [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Refreshing network info cache for port 8472693d-cc3d-4223-b981-b7d1e9f96531 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:34:10 np0005601977 nova_compute[183130]: 2026-01-30 09:34:10.787 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:11 np0005601977 nova_compute[183130]: 2026-01-30 09:34:11.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:11 np0005601977 nova_compute[183130]: 2026-01-30 09:34:11.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:11 np0005601977 nova_compute[183130]: 2026-01-30 09:34:11.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:11 np0005601977 nova_compute[183130]: 2026-01-30 09:34:11.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.178 183134 DEBUG nova.network.neutron [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.201 183134 INFO nova.compute.manager [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Took 1.83 seconds to deallocate network for instance.#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.252 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.253 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.430 183134 DEBUG nova.compute.provider_tree [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.444 183134 DEBUG nova.scheduler.client.report [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.464 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.495 183134 INFO nova.scheduler.client.report [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Deleted allocations for instance aed146e3-865d-4aee-a055-42ed41e035c5#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.596 183134 DEBUG oslo_concurrency.lockutils [None req-f2c16ff2-8ee3-436e-b302-9cfec2b9b62a 594b0fd4bee7455ab5aac7774bd07b70 396e2944b44f42e59b102db87e2e060c - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.611 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.611 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.612 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.612 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.662 183134 DEBUG nova.compute.manager [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.662 183134 DEBUG oslo_concurrency.lockutils [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.662 183134 DEBUG oslo_concurrency.lockutils [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.663 183134 DEBUG oslo_concurrency.lockutils [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.663 183134 DEBUG nova.compute.manager [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] No waiting events found dispatching network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.663 183134 WARNING nova.compute.manager [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received unexpected event network-vif-plugged-8472693d-cc3d-4223-b981-b7d1e9f96531 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:34:12 np0005601977 nova_compute[183130]: 2026-01-30 09:34:12.664 183134 DEBUG nova.compute.manager [req-669d18f6-a370-4758-9bac-aaf2ed5d4409 req-27eecaf5-31e3-45f8-9f94-7050d2545a6b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-vif-deleted-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.171 183134 DEBUG nova.network.neutron [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updated VIF entry in instance network info cache for port 8472693d-cc3d-4223-b981-b7d1e9f96531. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.172 183134 DEBUG nova.network.neutron [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Updating instance_info_cache with network_info: [{"id": "8472693d-cc3d-4223-b981-b7d1e9f96531", "address": "fa:16:3e:87:b2:b6", "network": {"id": "632dc37f-a471-48f7-998e-601c234d5eea", "bridge": "br-int", "label": "tempest-network-smoke--1563456286", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "396e2944b44f42e59b102db87e2e060c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8472693d-cc", "ovs_interfaceid": "8472693d-cc3d-4223-b981-b7d1e9f96531", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.192 183134 DEBUG oslo_concurrency.lockutils [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-aed146e3-865d-4aee-a055-42ed41e035c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.192 183134 DEBUG nova.compute.manager [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-vif-unplugged-8472693d-cc3d-4223-b981-b7d1e9f96531 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.193 183134 DEBUG oslo_concurrency.lockutils [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.193 183134 DEBUG oslo_concurrency.lockutils [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.193 183134 DEBUG oslo_concurrency.lockutils [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "aed146e3-865d-4aee-a055-42ed41e035c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.194 183134 DEBUG nova.compute.manager [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] No waiting events found dispatching network-vif-unplugged-8472693d-cc3d-4223-b981-b7d1e9f96531 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:13 np0005601977 nova_compute[183130]: 2026-01-30 09:34:13.194 183134 DEBUG nova.compute.manager [req-4c1b71d0-baa5-4f13-83d8-a41759a4554a req-64469548-04aa-4a99-b105-2d5ec9dcb7fb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Received event network-vif-unplugged-8472693d-cc3d-4223-b981-b7d1e9f96531 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.346 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.347 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.347 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.347 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.347 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.348 183134 INFO nova.compute.manager [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Terminating instance#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.349 183134 DEBUG nova.compute.manager [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.354 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 kernel: tapa5afd5ba-13 (unregistering): left promiscuous mode
Jan 30 04:34:15 np0005601977 NetworkManager[55565]: <info>  [1769765655.3725] device (tapa5afd5ba-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:15Z|00311|binding|INFO|Releasing lport a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e from this chassis (sb_readonly=0)
Jan 30 04:34:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:15Z|00312|binding|INFO|Setting lport a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e down in Southbound
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.382 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:15Z|00313|binding|INFO|Removing iface tapa5afd5ba-13 ovn-installed in OVS
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.386 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.390 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:d8:0d 10.100.0.18'], port_security=['fa:16:3e:c9:d8:0d 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '65c66677-23b6-479a-863f-3dd277183a7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed22b354-0eec-4dad-b9f9-3e87260fdb37', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbe9ac69-dab6-405f-be15-dcf6f6e9930e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.392 104706 INFO neutron.agent.ovn.metadata.agent [-] Port a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e in datapath 6c079c23-8031-4776-b9b7-153f2dd27fc7 unbound from our chassis#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.394 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c079c23-8031-4776-b9b7-153f2dd27fc7#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.395 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.407 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9029c306-27ef-40b2-8571-1df393f906cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.424 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[961ea529-d74b-4700-9cc8-dc1d115e346b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.427 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[763e1f01-9438-4ac4-b2e7-36b1d01dde3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:15 np0005601977 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 30 04:34:15 np0005601977 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Consumed 14.065s CPU time.
Jan 30 04:34:15 np0005601977 systemd-machined[154431]: Machine qemu-23-instance-0000001e terminated.
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.444 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[901870b3-b178-4c64-9995-61aea0f3cbf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.457 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[07ada828-9dcc-4b9d-a632-497e80362ea4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c079c23-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:96:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 7, 'rx_bytes': 742, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 7, 'rx_bytes': 742, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409728, 'reachable_time': 26163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220613, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.469 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2f04f444-89e8-4069-9ef2-ef54670cfa3f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap6c079c23-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409736, 'tstamp': 409736}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220614, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6c079c23-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409739, 'tstamp': 409739}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220614, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.471 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c079c23-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.473 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.477 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.477 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c079c23-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.478 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.479 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c079c23-80, col_values=(('external_ids', {'iface-id': 'ff915305-2000-4180-8452-99d99c6f677f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:15.479 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.562 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.566 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.595 183134 INFO nova.virt.libvirt.driver [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Instance destroyed successfully.#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.595 183134 DEBUG nova.objects.instance [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid 65c66677-23b6-479a-863f-3dd277183a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.613 183134 DEBUG nova.virt.libvirt.vif [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-667740087',display_name='tempest-TestNetworkBasicOps-server-667740087',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-667740087',id=30,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKtearXovlRuDr42R5KGcjTwDUuQcl31zxpUy8NrHTEFSD+z4AYJc/VvwzKFMhyXdM5aphE+owZVs1dFxITTom8VCU4qittDL9ERuX8/wSOcgKppHIOemqZyBEgxc7eS3Q==',key_name='tempest-TestNetworkBasicOps-1937833633',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-yw2tqwjc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:33:12Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=65c66677-23b6-479a-863f-3dd277183a7d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.613 183134 DEBUG nova.network.os_vif_util [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "address": "fa:16:3e:c9:d8:0d", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5afd5ba-13", "ovs_interfaceid": "a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.614 183134 DEBUG nova.network.os_vif_util [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.614 183134 DEBUG os_vif [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.617 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.617 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5afd5ba-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.618 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.620 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.622 183134 INFO os_vif [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d8:0d,bridge_name='br-int',has_traffic_filtering=True,id=a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5afd5ba-13')#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.622 183134 INFO nova.virt.libvirt.driver [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Deleting instance files /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d_del#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.623 183134 INFO nova.virt.libvirt.driver [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Deletion of /var/lib/nova/instances/65c66677-23b6-479a-863f-3dd277183a7d_del complete#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.679 183134 INFO nova.compute.manager [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.679 183134 DEBUG oslo.service.loopingcall [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.680 183134 DEBUG nova.compute.manager [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.680 183134 DEBUG nova.network.neutron [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.684 183134 DEBUG nova.compute.manager [req-89cd61b5-c20f-4d36-8ff6-274f6a11d3a9 req-c83714e6-044c-4bc0-9137-8938181dfbaf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-vif-unplugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.684 183134 DEBUG oslo_concurrency.lockutils [req-89cd61b5-c20f-4d36-8ff6-274f6a11d3a9 req-c83714e6-044c-4bc0-9137-8938181dfbaf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.685 183134 DEBUG oslo_concurrency.lockutils [req-89cd61b5-c20f-4d36-8ff6-274f6a11d3a9 req-c83714e6-044c-4bc0-9137-8938181dfbaf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.685 183134 DEBUG oslo_concurrency.lockutils [req-89cd61b5-c20f-4d36-8ff6-274f6a11d3a9 req-c83714e6-044c-4bc0-9137-8938181dfbaf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.685 183134 DEBUG nova.compute.manager [req-89cd61b5-c20f-4d36-8ff6-274f6a11d3a9 req-c83714e6-044c-4bc0-9137-8938181dfbaf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] No waiting events found dispatching network-vif-unplugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.685 183134 DEBUG nova.compute.manager [req-89cd61b5-c20f-4d36-8ff6-274f6a11d3a9 req-c83714e6-044c-4bc0-9137-8938181dfbaf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-vif-unplugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.709 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.732 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.733 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:34:15 np0005601977 nova_compute[183130]: 2026-01-30 09:34:15.791 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.105 183134 INFO nova.compute.manager [None req-cfc650ca-2015-4bb7-b9b3-c882b841abf6 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Get console output#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.112 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.316 183134 DEBUG nova.network.neutron [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.338 183134 INFO nova.compute.manager [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Took 0.66 seconds to deallocate network for instance.#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.386 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.387 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.528 183134 DEBUG nova.compute.provider_tree [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.558 183134 DEBUG nova.scheduler.client.report [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.587 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.625 183134 INFO nova.scheduler.client.report [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 65c66677-23b6-479a-863f-3dd277183a7d#033[00m
Jan 30 04:34:16 np0005601977 nova_compute[183130]: 2026-01-30 09:34:16.691 183134 DEBUG oslo_concurrency.lockutils [None req-6e95a49c-136f-4d20-89a2-ab426060280a a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00314|binding|INFO|Releasing lport ff915305-2000-4180-8452-99d99c6f677f from this chassis (sb_readonly=0)
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00315|binding|INFO|Releasing lport 50e26df2-7d93-4204-9b22-94b2140c0f47 from this chassis (sb_readonly=0)
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00316|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00317|binding|INFO|Releasing lport 92996e6c-be8d-4868-a92b-0dd619c09c89 from this chassis (sb_readonly=0)
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.426 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.427 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.428 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.428 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.429 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.430 183134 INFO nova.compute.manager [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Terminating instance#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.431 183134 DEBUG nova.compute.manager [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.438 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 kernel: tap747cab40-fb (unregistering): left promiscuous mode
Jan 30 04:34:17 np0005601977 NetworkManager[55565]: <info>  [1769765657.4575] device (tap747cab40-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00318|binding|INFO|Releasing lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 from this chassis (sb_readonly=0)
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00319|binding|INFO|Setting lport 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 down in Southbound
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.462 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:17Z|00320|binding|INFO|Removing iface tap747cab40-fb ovn-installed in OVS
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.464 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.474 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.497 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:de:f3 10.100.0.12'], port_security=['fa:16:3e:99:de:f3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0e693c72-183a-4005-8891-207b95ad22b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '90463a7d-c3a0-4624-975d-0cc4b6ff9814', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a98fd02a-19ea-434b-9ec2-1fdf64f82e5f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=747cab40-fbad-4008-a7ac-6cf1f12b6ee4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.499 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 in datapath b2ca1571-8ba0-4f98-bb63-cbd6ba450393 unbound from our chassis#033[00m
Jan 30 04:34:17 np0005601977 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 30 04:34:17 np0005601977 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000001f.scope: Consumed 12.534s CPU time.
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.502 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2ca1571-8ba0-4f98-bb63-cbd6ba450393, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.503 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8e133df9-d1ca-4e0e-89b8-e29b38125c46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.504 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 namespace which is not needed anymore#033[00m
Jan 30 04:34:17 np0005601977 systemd-machined[154431]: Machine qemu-27-instance-0000001f terminated.
Jan 30 04:34:17 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [NOTICE]   (220343) : haproxy version is 2.8.14-c23fe91
Jan 30 04:34:17 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [NOTICE]   (220343) : path to executable is /usr/sbin/haproxy
Jan 30 04:34:17 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [WARNING]  (220343) : Exiting Master process...
Jan 30 04:34:17 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [ALERT]    (220343) : Current worker (220345) exited with code 143 (Terminated)
Jan 30 04:34:17 np0005601977 neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393[220339]: [WARNING]  (220343) : All workers exited. Exiting... (0)
Jan 30 04:34:17 np0005601977 systemd[1]: libpod-8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628.scope: Deactivated successfully.
Jan 30 04:34:17 np0005601977 podman[220654]: 2026-01-30 09:34:17.641466007 +0000 UTC m=+0.043340553 container died 8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:34:17 np0005601977 systemd[1]: var-lib-containers-storage-overlay-ac46e4207006103a28ccdcdc19d9524c9dabf760bc0ed83292390ac0698dfb56-merged.mount: Deactivated successfully.
Jan 30 04:34:17 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628-userdata-shm.mount: Deactivated successfully.
Jan 30 04:34:17 np0005601977 podman[220654]: 2026-01-30 09:34:17.677029635 +0000 UTC m=+0.078904181 container cleanup 8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.698 183134 INFO nova.virt.libvirt.driver [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Instance destroyed successfully.#033[00m
Jan 30 04:34:17 np0005601977 systemd[1]: libpod-conmon-8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628.scope: Deactivated successfully.
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.699 183134 DEBUG nova.objects.instance [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 0e693c72-183a-4005-8891-207b95ad22b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.751 183134 DEBUG nova.virt.libvirt.vif [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-30T09:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-207746804',display_name='tempest-TestNetworkAdvancedServerOps-server-207746804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-207746804',id=31,image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOHN1UvdVdxjRqWodp0BMNbZ+GuyrulD0WI5KcbScYGQgPQB4wl/ZaktEG5xr0Om9ojhk6Hzy9SxfALCy1xa8KSr75yE8ZE1A0eo/1WyunUzyt9Blwa2sI8tAidj85d5Hw==',key_name='tempest-TestNetworkAdvancedServerOps-1735414248',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-mw7kacnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='2eb3f7a8-d1f2-41d5-9e16-cec6cebdde74',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:33:58Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=0e693c72-183a-4005-8891-207b95ad22b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.751 183134 DEBUG nova.network.os_vif_util [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.752 183134 DEBUG nova.network.os_vif_util [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:17 np0005601977 podman[220697]: 2026-01-30 09:34:17.752509056 +0000 UTC m=+0.044924459 container remove 8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.752 183134 DEBUG os_vif [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.754 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.754 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap747cab40-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.755 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.757 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.757 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a78f3c1a-5259-4eeb-b6ad-7614aa6d41c9]: (4, ('Fri Jan 30 09:34:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 (8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628)\n8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628\nFri Jan 30 09:34:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 (8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628)\n8ad653252dd954adbedf40ed33febaf9cfecf54ba6521d2a4e6c0643eebde628\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.759 183134 INFO os_vif [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:de:f3,bridge_name='br-int',has_traffic_filtering=True,id=747cab40-fbad-4008-a7ac-6cf1f12b6ee4,network=Network(b2ca1571-8ba0-4f98-bb63-cbd6ba450393),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap747cab40-fb')#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.759 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f4be04e9-093c-4796-861f-0e23e11071cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.760 183134 INFO nova.virt.libvirt.driver [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Deleting instance files /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1_del#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.760 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2ca1571-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.760 183134 INFO nova.virt.libvirt.driver [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Deletion of /var/lib/nova/instances/0e693c72-183a-4005-8891-207b95ad22b1_del complete#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.762 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 kernel: tapb2ca1571-80: left promiscuous mode
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.767 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.769 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.771 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a4cf401b-1dd7-44b3-a17a-9cdc4b65dab9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.786 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1af4d58b-d9dc-42af-8456-f928c5e1470c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.788 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f201dd7b-1610-485b-91f8-1ddead17c784]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.805 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1bcd68-b573-40cf-ba58-864207947c3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416874, 'reachable_time': 37608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220713, 'error': None, 'target': 'ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.808 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2ca1571-8ba0-4f98-bb63-cbd6ba450393 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:34:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:17.808 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[6090fdba-0d0b-4e16-8497-999f9a614ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:17 np0005601977 systemd[1]: run-netns-ovnmeta\x2db2ca1571\x2d8ba0\x2d4f98\x2dbb63\x2dcbd6ba450393.mount: Deactivated successfully.
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.833 183134 INFO nova.compute.manager [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.834 183134 DEBUG oslo.service.loopingcall [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.834 183134 DEBUG nova.compute.manager [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.834 183134 DEBUG nova.network.neutron [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.873 183134 DEBUG oslo_concurrency.lockutils [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "interface-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-3e8e7ac3-7773-46da-922a-c24dce47f456" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.873 183134 DEBUG oslo_concurrency.lockutils [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-3e8e7ac3-7773-46da-922a-c24dce47f456" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.933 183134 DEBUG nova.objects.instance [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'flavor' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.954 183134 DEBUG nova.compute.manager [req-401ca557-cdce-45c1-b816-91e10ff5ad94 req-9cc824d1-3448-4fe5-aead-9da5c4537c75 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-unplugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.955 183134 DEBUG oslo_concurrency.lockutils [req-401ca557-cdce-45c1-b816-91e10ff5ad94 req-9cc824d1-3448-4fe5-aead-9da5c4537c75 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.955 183134 DEBUG oslo_concurrency.lockutils [req-401ca557-cdce-45c1-b816-91e10ff5ad94 req-9cc824d1-3448-4fe5-aead-9da5c4537c75 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.956 183134 DEBUG oslo_concurrency.lockutils [req-401ca557-cdce-45c1-b816-91e10ff5ad94 req-9cc824d1-3448-4fe5-aead-9da5c4537c75 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.956 183134 DEBUG nova.compute.manager [req-401ca557-cdce-45c1-b816-91e10ff5ad94 req-9cc824d1-3448-4fe5-aead-9da5c4537c75 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-unplugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.957 183134 DEBUG nova.compute.manager [req-401ca557-cdce-45c1-b816-91e10ff5ad94 req-9cc824d1-3448-4fe5-aead-9da5c4537c75 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-unplugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.982 183134 DEBUG nova.virt.libvirt.vif [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.983 183134 DEBUG nova.network.os_vif_util [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.984 183134 DEBUG nova.network.os_vif_util [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.988 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.991 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.995 183134 DEBUG nova.virt.libvirt.driver [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Attempting to detach device tap3e8e7ac3-77 from instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 30 04:34:17 np0005601977 nova_compute[183130]: 2026-01-30 09:34:17.996 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] detach device xml: <interface type="ethernet">
Jan 30 04:34:17 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:e7:3b:1b"/>
Jan 30 04:34:17 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:34:17 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:34:17 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:34:17 np0005601977 nova_compute[183130]:  <target dev="tap3e8e7ac3-77"/>
Jan 30 04:34:17 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:34:17 np0005601977 nova_compute[183130]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.003 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.007 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface>not found in domain: <domain type='kvm' id='19'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <name>instance-0000001b</name>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <uuid>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</uuid>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:32:45</nova:creationTime>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:port uuid="3e8e7ac3-7773-46da-922a-c24dce47f456">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='serial'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='uuid'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk' index='2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config' index='1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:ac:3e:b3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='tapf469de0f-e3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:e7:3b:1b'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='tap3e8e7ac3-77'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='net1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/3'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c275,c311</label>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c275,c311</imagelabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.007 183134 INFO nova.virt.libvirt.driver [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully detached device tap3e8e7ac3-77 from instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f from the persistent domain config.#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.007 183134 DEBUG nova.virt.libvirt.driver [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] (1/8): Attempting to detach device tap3e8e7ac3-77 with device alias net1 from instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.008 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] detach device xml: <interface type="ethernet">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <mac address="fa:16:3e:e7:3b:1b"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <model type="virtio"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <mtu size="1442"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <target dev="tap3e8e7ac3-77"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: </interface>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.073 183134 DEBUG nova.compute.manager [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.074 183134 DEBUG oslo_concurrency.lockutils [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "65c66677-23b6-479a-863f-3dd277183a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.074 183134 DEBUG oslo_concurrency.lockutils [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.074 183134 DEBUG oslo_concurrency.lockutils [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "65c66677-23b6-479a-863f-3dd277183a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.075 183134 DEBUG nova.compute.manager [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] No waiting events found dispatching network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.075 183134 WARNING nova.compute.manager [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received unexpected event network-vif-plugged-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.075 183134 DEBUG nova.compute.manager [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Received event network-vif-deleted-a5afd5ba-13e1-4ebd-a2c3-bf2bee49f68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.075 183134 DEBUG nova.compute.manager [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-changed-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.076 183134 DEBUG nova.compute.manager [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Refreshing instance network info cache due to event network-changed-747cab40-fbad-4008-a7ac-6cf1f12b6ee4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.076 183134 DEBUG oslo_concurrency.lockutils [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.076 183134 DEBUG oslo_concurrency.lockutils [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.076 183134 DEBUG nova.network.neutron [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Refreshing network info cache for port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:34:18 np0005601977 kernel: tap3e8e7ac3-77 (unregistering): left promiscuous mode
Jan 30 04:34:18 np0005601977 NetworkManager[55565]: <info>  [1769765658.1155] device (tap3e8e7ac3-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:18Z|00321|binding|INFO|Releasing lport 3e8e7ac3-7773-46da-922a-c24dce47f456 from this chassis (sb_readonly=0)
Jan 30 04:34:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:18Z|00322|binding|INFO|Setting lport 3e8e7ac3-7773-46da-922a-c24dce47f456 down in Southbound
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.125 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:18Z|00323|binding|INFO|Removing iface tap3e8e7ac3-77 ovn-installed in OVS
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.127 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.128 183134 DEBUG nova.virt.libvirt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Received event <DeviceRemovedEvent: 1769765658.128462, 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.129 183134 DEBUG nova.virt.libvirt.driver [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Start waiting for the detach event from libvirt for device tap3e8e7ac3-77 with device alias net1 for instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.130 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.130 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.134 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface>not found in domain: <domain type='kvm' id='19'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <name>instance-0000001b</name>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <uuid>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</uuid>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:32:45</nova:creationTime>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:port uuid="3e8e7ac3-7773-46da-922a-c24dce47f456">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='serial'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='uuid'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk' index='2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config' index='1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:ac:3e:b3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target dev='tapf469de0f-e3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/3'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c275,c311</label>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c275,c311</imagelabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.134 183134 INFO nova.virt.libvirt.driver [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully detached device tap3e8e7ac3-77 from instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f from the live domain config.#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.135 183134 DEBUG nova.virt.libvirt.vif [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.135 183134 DEBUG nova.network.os_vif_util [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.136 183134 DEBUG nova.network.os_vif_util [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.136 183134 DEBUG os_vif [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.138 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.138 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e8e7ac3-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.140 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.142 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.145 183134 INFO os_vif [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77')#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.146 183134 DEBUG nova.virt.libvirt.guest [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:34:18</nova:creationTime>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:34:18 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:18 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:34:18 np0005601977 nova_compute[183130]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.207 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:3b:1b 10.100.0.20', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbe9ac69-dab6-405f-be15-dcf6f6e9930e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=3e8e7ac3-7773-46da-922a-c24dce47f456) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.208 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 3e8e7ac3-7773-46da-922a-c24dce47f456 in datapath 6c079c23-8031-4776-b9b7-153f2dd27fc7 unbound from our chassis#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.210 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c079c23-8031-4776-b9b7-153f2dd27fc7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.211 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[607b5680-6f49-46ca-b6c7-9a356bae8c58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.212 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7 namespace which is not needed anymore#033[00m
Jan 30 04:34:18 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [NOTICE]   (218959) : haproxy version is 2.8.14-c23fe91
Jan 30 04:34:18 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [NOTICE]   (218959) : path to executable is /usr/sbin/haproxy
Jan 30 04:34:18 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [WARNING]  (218959) : Exiting Master process...
Jan 30 04:34:18 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [WARNING]  (218959) : Exiting Master process...
Jan 30 04:34:18 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [ALERT]    (218959) : Current worker (218961) exited with code 143 (Terminated)
Jan 30 04:34:18 np0005601977 neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7[218955]: [WARNING]  (218959) : All workers exited. Exiting... (0)
Jan 30 04:34:18 np0005601977 systemd[1]: libpod-b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c.scope: Deactivated successfully.
Jan 30 04:34:18 np0005601977 podman[220734]: 2026-01-30 09:34:18.349996982 +0000 UTC m=+0.047704280 container died b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 30 04:34:18 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c-userdata-shm.mount: Deactivated successfully.
Jan 30 04:34:18 np0005601977 systemd[1]: var-lib-containers-storage-overlay-a223d6f71021fcb7d14385395842d22a96185d2fa64ffb92854e08f2047da1df-merged.mount: Deactivated successfully.
Jan 30 04:34:18 np0005601977 podman[220734]: 2026-01-30 09:34:18.381637926 +0000 UTC m=+0.079345184 container cleanup b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 30 04:34:18 np0005601977 systemd[1]: libpod-conmon-b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c.scope: Deactivated successfully.
Jan 30 04:34:18 np0005601977 podman[220762]: 2026-01-30 09:34:18.437943963 +0000 UTC m=+0.039781120 container remove b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.441 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[969cf64d-d733-47f2-a9bf-e92acfc5432c]: (4, ('Fri Jan 30 09:34:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7 (b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c)\nb3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c\nFri Jan 30 09:34:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7 (b3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c)\nb3382297519658a938cceb2c59f40aa2b114dc3e2e83a45952de88e1afe3318c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.442 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9e9fde-7f47-499f-adc1-5681e6947e66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.443 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c079c23-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.445 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 kernel: tap6c079c23-80: left promiscuous mode
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.449 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.450 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.452 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4c30da80-03b7-4076-9269-bbd3d64a907f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.471 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa643b2-aa86-44cf-b04b-f9709ae8ff2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.473 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[78cc9a02-a5ab-4f51-a6f2-14f91a74c1fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.494 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b6cf3cd7-cb37-4dfa-b5f0-51de98a6b5e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409723, 'reachable_time': 15345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220781, 'error': None, 'target': 'ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.496 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c079c23-8031-4776-b9b7-153f2dd27fc7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:34:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:18.496 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[35328aea-4fd2-4cb3-bef1-0186d2c39283]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:18 np0005601977 systemd[1]: run-netns-ovnmeta\x2d6c079c23\x2d8031\x2d4776\x2db9b7\x2d153f2dd27fc7.mount: Deactivated successfully.
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.727 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.842 183134 DEBUG nova.network.neutron [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.885 183134 INFO nova.compute.manager [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Took 1.05 seconds to deallocate network for instance.#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.952 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:18 np0005601977 nova_compute[183130]: 2026-01-30 09:34:18.953 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.107 183134 DEBUG nova.compute.provider_tree [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.129 183134 DEBUG nova.scheduler.client.report [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.174 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.217 183134 DEBUG oslo_concurrency.lockutils [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.218 183134 DEBUG oslo_concurrency.lockutils [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.219 183134 DEBUG nova.network.neutron [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.222 183134 INFO nova.scheduler.client.report [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocations for instance 0e693c72-183a-4005-8891-207b95ad22b1#033[00m
Jan 30 04:34:19 np0005601977 nova_compute[183130]: 2026-01-30 09:34:19.581 183134 DEBUG oslo_concurrency.lockutils [None req-a3670948-52f9-47b5-88f8-5adff2ed88f9 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:19 np0005601977 podman[220784]: 2026-01-30 09:34:19.876936146 +0000 UTC m=+0.092059761 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 30 04:34:19 np0005601977 podman[220783]: 2026-01-30 09:34:19.877290556 +0000 UTC m=+0.092680929 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., release=1769056855, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, version=9.7, architecture=x86_64, container_name=openstack_network_exporter)
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.163 183134 DEBUG nova.compute.manager [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.163 183134 DEBUG oslo_concurrency.lockutils [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "0e693c72-183a-4005-8891-207b95ad22b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.164 183134 DEBUG oslo_concurrency.lockutils [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.165 183134 DEBUG oslo_concurrency.lockutils [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "0e693c72-183a-4005-8891-207b95ad22b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.165 183134 DEBUG nova.compute.manager [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] No waiting events found dispatching network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.166 183134 WARNING nova.compute.manager [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received unexpected event network-vif-plugged-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.166 183134 DEBUG nova.compute.manager [req-d00a4c06-69fd-4fdc-a1ca-09f5c4af2e2c req-7d76d491-1255-4f17-a979-87cc6053043e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Received event network-vif-deleted-747cab40-fbad-4008-a7ac-6cf1f12b6ee4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.336 183134 DEBUG nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-unplugged-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.337 183134 DEBUG oslo_concurrency.lockutils [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.337 183134 DEBUG oslo_concurrency.lockutils [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.338 183134 DEBUG oslo_concurrency.lockutils [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.338 183134 DEBUG nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-unplugged-3e8e7ac3-7773-46da-922a-c24dce47f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.339 183134 WARNING nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-unplugged-3e8e7ac3-7773-46da-922a-c24dce47f456 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.339 183134 DEBUG nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.340 183134 DEBUG oslo_concurrency.lockutils [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.340 183134 DEBUG oslo_concurrency.lockutils [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.340 183134 DEBUG oslo_concurrency.lockutils [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.341 183134 DEBUG nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.341 183134 WARNING nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-3e8e7ac3-7773-46da-922a-c24dce47f456 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.341 183134 DEBUG nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-deleted-3e8e7ac3-7773-46da-922a-c24dce47f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.342 183134 INFO nova.compute.manager [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Neutron deleted interface 3e8e7ac3-7773-46da-922a-c24dce47f456; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.342 183134 DEBUG nova.network.neutron [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.372 183134 DEBUG nova.objects.instance [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lazy-loading 'system_metadata' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.792 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.838 183134 DEBUG nova.objects.instance [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lazy-loading 'flavor' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.874 183134 DEBUG nova.virt.libvirt.vif [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.874 183134 DEBUG nova.network.os_vif_util [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converting VIF {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.875 183134 DEBUG nova.network.os_vif_util [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.879 183134 DEBUG nova.virt.libvirt.guest [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.882 183134 DEBUG nova.virt.libvirt.guest [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface>not found in domain: <domain type='kvm' id='19'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <name>instance-0000001b</name>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <uuid>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</uuid>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:34:18</nova:creationTime>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='serial'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='uuid'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk' index='2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config' index='1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:ac:3e:b3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target dev='tapf469de0f-e3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/3'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c275,c311</label>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c275,c311</imagelabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.883 183134 DEBUG nova.virt.libvirt.guest [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.887 183134 DEBUG nova.virt.libvirt.guest [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:e7:3b:1b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e8e7ac3-77"/></interface>not found in domain: <domain type='kvm' id='19'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <name>instance-0000001b</name>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <uuid>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</uuid>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:34:18</nova:creationTime>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <memory unit='KiB'>131072</memory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <vcpu placement='static'>1</vcpu>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <resource>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <partition>/machine</partition>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </resource>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <sysinfo type='smbios'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='manufacturer'>RDO</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='product'>OpenStack Compute</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='serial'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='uuid'>18ac4790-626b-4d8b-9ba9-34f94dfa7a3f</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <entry name='family'>Virtual Machine</entry>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <boot dev='hd'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <smbios mode='sysinfo'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <vmcoreinfo state='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <cpu mode='custom' match='exact' check='full'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <model fallback='forbid'>Nehalem</model>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <feature policy='require' name='x2apic'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <feature policy='require' name='hypervisor'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <feature policy='require' name='vme'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <clock offset='utc'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <timer name='pit' tickpolicy='delay'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <timer name='hpet' present='no'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <on_poweroff>destroy</on_poweroff>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <on_reboot>restart</on_reboot>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <on_crash>destroy</on_crash>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <disk type='file' device='disk'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk' index='2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <backingStore type='file' index='3'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <format type='raw'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <source file='/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <backingStore/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      </backingStore>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target dev='vda' bus='virtio'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='virtio-disk0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <disk type='file' device='cdrom'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <driver name='qemu' type='raw' cache='none'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/disk.config' index='1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <backingStore/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target dev='sda' bus='sata'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <readonly/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='sata0-0-0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='0' model='pcie-root'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pcie.0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='1' port='0x10'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='2' port='0x11'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='3' port='0x12'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='4' port='0x13'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='5' port='0x14'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='6' port='0x15'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='7' port='0x16'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='8' port='0x17'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.8'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='9' port='0x18'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.9'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='10' port='0x19'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.10'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='11' port='0x1a'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.11'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='12' port='0x1b'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.12'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='13' port='0x1c'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.13'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='14' port='0x1d'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.14'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='15' port='0x1e'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.15'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='16' port='0x1f'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.16'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='17' port='0x20'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.17'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='18' port='0x21'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.18'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='19' port='0x22'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.19'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='20' port='0x23'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.20'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='21' port='0x24'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.21'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='22' port='0x25'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.22'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='23' port='0x26'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.23'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='24' port='0x27'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.24'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-root-port'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target chassis='25' port='0x28'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.25'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model name='pcie-pci-bridge'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='pci.26'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='usb'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <controller type='sata' index='0'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='ide'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </controller>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <interface type='ethernet'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <mac address='fa:16:3e:ac:3e:b3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target dev='tapf469de0f-e3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model type='virtio'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <driver name='vhost' rx_queue_size='512'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <mtu size='1442'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='net0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <serial type='pty'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target type='isa-serial' port='0'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:        <model name='isa-serial'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      </target>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <console type='pty' tty='/dev/pts/3'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <source path='/dev/pts/3'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <log file='/var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f/console.log' append='off'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <target type='serial' port='0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='serial0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </console>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <input type='tablet' bus='usb'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='input0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='usb' bus='0' port='1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <input type='mouse' bus='ps2'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='input1'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <input type='keyboard' bus='ps2'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='input2'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </input>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <listen type='address' address='::0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </graphics>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <audio id='1' type='none'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <model type='virtio' heads='1' primary='yes'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='video0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <watchdog model='itco' action='reset'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='watchdog0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </watchdog>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <memballoon model='virtio'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <stats period='10'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='balloon0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <rng model='virtio'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <backend model='random'>/dev/urandom</backend>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <alias name='rng0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <label>system_u:system_r:svirt_t:s0:c275,c311</label>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c275,c311</imagelabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <label>+107:+107</label>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <imagelabel>+107:+107</imagelabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </seclabel>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.887 183134 WARNING nova.virt.libvirt.driver [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Detaching interface fa:16:3e:e7:3b:1b failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap3e8e7ac3-77' not found.#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.888 183134 DEBUG nova.virt.libvirt.vif [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
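The VIF record above reports "mtu": 1442 alongside "tunneled": true. On OVN deployments this tenant MTU is typically the underlay MTU minus the Geneve encapsulation overhead, which OVN documents as 58 bytes for Geneve over IPv4. A minimal sketch of that arithmetic — the 1500-byte underlay MTU is an assumption, not a value read from this log:

```python
# Illustrative only: how a 1442-byte tenant MTU can arise on a
# Geneve-tunneled OVN network. The underlay MTU is assumed.
PHYSICAL_MTU = 1500          # assumed underlay NIC MTU
GENEVE_IPV4_OVERHEAD = 58    # per-packet overhead OVN budgets for Geneve/IPv4

tenant_mtu = PHYSICAL_MTU - GENEVE_IPV4_OVERHEAD
print(tenant_mtu)  # 1442, matching the "mtu": 1442 in the VIF above
```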
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.888 183134 DEBUG nova.network.os_vif_util [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converting VIF {"id": "3e8e7ac3-7773-46da-922a-c24dce47f456", "address": "fa:16:3e:e7:3b:1b", "network": {"id": "6c079c23-8031-4776-b9b7-153f2dd27fc7", "bridge": "br-int", "label": "tempest-network-smoke--1439240769", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e8e7ac3-77", "ovs_interfaceid": "3e8e7ac3-7773-46da-922a-c24dce47f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.889 183134 DEBUG nova.network.os_vif_util [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
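The vif_name 'tap3e8e7ac3-77' in the converted object above follows Nova's device-naming convention: "tap" plus the first 11 characters of the Neutron port UUID, which keeps the result within the kernel's 15-byte interface-name limit (IFNAMSIZ). A minimal sketch:

```python
# Sketch of Nova's tap-device naming: "tap" + first 11 chars of the
# Neutron port UUID. Verified against the devname in this log.
def vif_devname(port_id: str) -> str:
    return "tap" + port_id[:11]

print(vif_devname("3e8e7ac3-7773-46da-922a-c24dce47f456"))  # tap3e8e7ac3-77
```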
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.889 183134 DEBUG os_vif [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.890 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.890 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e8e7ac3-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.891 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
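The DelPortCommand above runs with if_exists=True, so when the port row is already gone (the tap device disappears when the guest is torn down) the transaction still commits, just as a no-op — which is why ovsdbapp logs "Transaction caused no change" rather than an error. A toy model of that idempotent-delete semantics; the set-based "bridge" is purely illustrative, not the ovsdbapp API:

```python
# Illustrative model of if_exists=True delete semantics; a real
# DelPortCommand talks to the OVSDB server, not an in-memory set.
def del_port(bridge_ports: set, port: str, if_exists: bool = True) -> bool:
    """Return True if the transaction changed anything."""
    if port not in bridge_ports:
        if not if_exists:
            raise RuntimeError(f"port {port} does not exist")
        return False          # commit succeeds: "Transaction caused no change"
    bridge_ports.discard(port)
    return True               # port row removed

ports = {"tap747cab40-fb"}
print(del_port(ports, "tap3e8e7ac3-77"))  # False: already gone, no change
```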
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.892 183134 INFO os_vif [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3b:1b,bridge_name='br-int',has_traffic_filtering=True,id=3e8e7ac3-7773-46da-922a-c24dce47f456,network=Network(6c079c23-8031-4776-b9b7-153f2dd27fc7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e8e7ac3-77')#033[00m
Jan 30 04:34:20 np0005601977 nova_compute[183130]: 2026-01-30 09:34:20.893 183134 DEBUG nova.virt.libvirt.guest [req-b54b38a2-bbc4-4735-b1b0-ec0bd6810bce req-32aeb678-9fbd-4cb2-adaa-997159aa7f15 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:name>tempest-TestNetworkBasicOps-server-889211547</nova:name>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:creationTime>2026-01-30 09:34:20</nova:creationTime>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:flavor name="m1.nano">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:memory>128</nova:memory>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:disk>1</nova:disk>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:swap>0</nova:swap>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:vcpus>1</nova:vcpus>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:flavor>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:owner>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:owner>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  <nova:ports>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    <nova:port uuid="f469de0f-e330-4b6b-853b-397301173e4e">
Jan 30 04:34:20 np0005601977 nova_compute[183130]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:    </nova:port>
Jan 30 04:34:20 np0005601977 nova_compute[183130]:  </nova:ports>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: </nova:instance>
Jan 30 04:34:20 np0005601977 nova_compute[183130]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
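The nova:instance metadata document written to the domain above can be reproduced with any XML builder. A hypothetical stdlib sketch (the element names, the namespace URI, and the values are taken from the log record above; the real implementation lives in nova.virt.libvirt and differs in detail):

```python
import xml.etree.ElementTree as ET

# Namespace as it appears in the log's <nova:instance> element.
NS = "http://openstack.org/xmlns/libvirt/nova/1.1"
ET.register_namespace("nova", NS)

instance = ET.Element(f"{{{NS}}}instance")
ET.SubElement(instance, f"{{{NS}}}name").text = (
    "tempest-TestNetworkBasicOps-server-889211547")
flavor = ET.SubElement(instance, f"{{{NS}}}flavor", name="m1.nano")
ET.SubElement(flavor, f"{{{NS}}}memory").text = "128"
ET.SubElement(flavor, f"{{{NS}}}vcpus").text = "1"

xml = ET.tostring(instance, encoding="unicode")
print(xml)
```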
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.369 183134 DEBUG nova.network.neutron [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updated VIF entry in instance network info cache for port 747cab40-fbad-4008-a7ac-6cf1f12b6ee4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.370 183134 DEBUG nova.network.neutron [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Updating instance_info_cache with network_info: [{"id": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "address": "fa:16:3e:99:de:f3", "network": {"id": "b2ca1571-8ba0-4f98-bb63-cbd6ba450393", "bridge": "br-int", "label": "tempest-network-smoke--882207796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap747cab40-fb", "ovs_interfaceid": "747cab40-fbad-4008-a7ac-6cf1f12b6ee4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.396 183134 DEBUG oslo_concurrency.lockutils [req-bbffe4bf-fe07-460f-9a6b-8ece9f8d458b req-1addc980-d99f-4328-9a4b-2dfd4db8c65b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-0e693c72-183a-4005-8891-207b95ad22b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.411 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765647.4109223, 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.411 183134 INFO nova.compute.manager [-] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.435 183134 DEBUG nova.compute.manager [None req-d3d1051b-26a6-4f7d-86f2-33e3456af296 - - - - - -] [instance: 12406b2c-7c9c-41b8-b0c7-30bf4455b4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.470 183134 DEBUG nova.compute.manager [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.550 183134 INFO nova.compute.manager [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] instance snapshotting#033[00m
Jan 30 04:34:22 np0005601977 nova_compute[183130]: 2026-01-30 09:34:22.888 183134 INFO nova.virt.libvirt.driver [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Beginning live snapshot process#033[00m
Jan 30 04:34:23 np0005601977 virtqemud[182587]: invalid argument: disk vda does not have an active block job
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.043 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.120 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json -f qcow2" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.121 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.141 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.184 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.200 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.277 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.278 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6r89ze88/6ad1b08ddabb4f7a841940360f07e572.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.306 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6r89ze88/6ad1b08ddabb4f7a841940360f07e572.delta 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.308 183134 INFO nova.virt.libvirt.driver [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 30 04:34:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:23Z|00324|binding|INFO|Releasing lport 50e26df2-7d93-4204-9b22-94b2140c0f47 from this chassis (sb_readonly=0)
Jan 30 04:34:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:23Z|00325|binding|INFO|Releasing lport afb82ca4-9bbd-4c23-b82a-439171c628d6 from this chassis (sb_readonly=0)
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.348 183134 DEBUG nova.virt.libvirt.guest [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] COPY block job progress, current cursor: 0 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.382 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.733 183134 INFO nova.network.neutron [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Port 3e8e7ac3-7773-46da-922a-c24dce47f456 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.734 183134 DEBUG nova.network.neutron [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.756 183134 DEBUG oslo_concurrency.lockutils [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.805 183134 DEBUG oslo_concurrency.lockutils [None req-506910a8-40fe-41ac-aa56-35ed5ca73b18 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "interface-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-3e8e7ac3-7773-46da-922a-c24dce47f456" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.852 183134 DEBUG nova.virt.libvirt.guest [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] COPY block job progress, current cursor: 1048576 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.856 183134 INFO nova.virt.libvirt.driver [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.901 183134 DEBUG nova.privsep.utils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 30 04:34:23 np0005601977 nova_compute[183130]: 2026-01-30 09:34:23.902 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6r89ze88/6ad1b08ddabb4f7a841940360f07e572.delta /var/lib/nova/instances/snapshots/tmp6r89ze88/6ad1b08ddabb4f7a841940360f07e572 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.286 183134 DEBUG oslo_concurrency.processutils [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6r89ze88/6ad1b08ddabb4f7a841940360f07e572.delta /var/lib/nova/instances/snapshots/tmp6r89ze88/6ad1b08ddabb4f7a841940360f07e572" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
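The live-snapshot sequence that ends here has three steps: create a qcow2 overlay (.delta) backed by the cached base image, let libvirt blockcopy the running disk into it (the "COPY block job progress" lines), then qemu-img convert the overlay into a standalone qcow2 for upload. A sketch that reconstructs the two qemu-img command lines as argv lists, with paths copied from the log — this only builds the commands, it does not run qemu-img:

```python
# Reconstruction of the two qemu-img invocations from the snapshot flow above.
base = "/var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4"
tmp = "/var/lib/nova/instances/snapshots/tmp6r89ze88"
name = "6ad1b08ddabb4f7a841940360f07e572"

create_cmd = [
    "qemu-img", "create", "-f", "qcow2",
    "-o", f"backing_file={base},backing_fmt=raw",
    f"{tmp}/{name}.delta",
    "1073741824",                          # 1 GiB virtual size (root_gb=1)
]
convert_cmd = [
    "qemu-img", "convert",
    "-t", "none",                          # bypass host page cache (direct I/O)
    "-O", "qcow2", "-f", "qcow2",
    f"{tmp}/{name}.delta", f"{tmp}/{name}",
]
print(" ".join(create_cmd))
```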
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.287 183134 INFO nova.virt.libvirt.driver [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Snapshot extracted, beginning image upload#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.420 183134 DEBUG nova.compute.manager [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-changed-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.420 183134 DEBUG nova.compute.manager [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing instance network info cache due to event network-changed-f469de0f-e330-4b6b-853b-397301173e4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.421 183134 DEBUG oslo_concurrency.lockutils [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.421 183134 DEBUG oslo_concurrency.lockutils [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.421 183134 DEBUG nova.network.neutron [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Refreshing network info cache for port f469de0f-e330-4b6b-853b-397301173e4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.512 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.513 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.513 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.513 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.514 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
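The acquire/release pairs above come from oslo.concurrency's named locks: one lock object per string key (instance UUID, "<uuid>-events", "refresh_cache-<uuid>"), held only for the critical section, with hold/wait times logged. The same pattern can be sketched with the stdlib — this illustrates the idea only; oslo_concurrency adds fair locks, external file locks, and the timing instrumentation seen in the log:

```python
import threading
from collections import defaultdict

# One lock per name, mirroring the 'Lock "<uuid>-events" acquired/released'
# pattern above. Illustrative sketch, not the oslo_concurrency implementation.
_locks = defaultdict(threading.Lock)
_registry_guard = threading.Lock()

def synchronized(name: str) -> threading.Lock:
    with _registry_guard:        # protect the lock registry itself
        return _locks[name]      # use as: with synchronized(name): ...

with synchronized("18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events"):
    pass  # critical section, e.g. clearing pending events for the instance
```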
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.515 183134 INFO nova.compute.manager [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Terminating instance#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.516 183134 DEBUG nova.compute.manager [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:34:24 np0005601977 kernel: tapf469de0f-e3 (unregistering): left promiscuous mode
Jan 30 04:34:24 np0005601977 NetworkManager[55565]: <info>  [1769765664.5464] device (tapf469de0f-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00326|binding|INFO|Releasing lport f469de0f-e330-4b6b-853b-397301173e4e from this chassis (sb_readonly=0)
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00327|binding|INFO|Setting lport f469de0f-e330-4b6b-853b-397301173e4e down in Southbound
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.551 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00328|binding|INFO|Removing iface tapf469de0f-e3 ovn-installed in OVS
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.558 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.561 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:3e:b3 10.100.0.10'], port_security=['fa:16:3e:ac:3e:b3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': '465bb202-1df3-4c6e-82e9-19a120fe9790', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11cd1bd5-e27d-4fc7-95b6-d09dd95ff43a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=f469de0f-e330-4b6b-853b-397301173e4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.562 104706 INFO neutron.agent.ovn.metadata.agent [-] Port f469de0f-e330-4b6b-853b-397301173e4e in datapath 408e9205-54bc-4c8e-9fe0-c3c49be6610d unbound from our chassis#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.565 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 408e9205-54bc-4c8e-9fe0-c3c49be6610d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.566 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[87243791-ffde-4688-9a8b-21708b11780d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.567 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d namespace which is not needed anymore#033[00m
Jan 30 04:34:24 np0005601977 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 30 04:34:24 np0005601977 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001b.scope: Consumed 18.048s CPU time.
Jan 30 04:34:24 np0005601977 systemd-machined[154431]: Machine qemu-19-instance-0000001b terminated.
Jan 30 04:34:24 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [NOTICE]   (218356) : haproxy version is 2.8.14-c23fe91
Jan 30 04:34:24 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [NOTICE]   (218356) : path to executable is /usr/sbin/haproxy
Jan 30 04:34:24 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [WARNING]  (218356) : Exiting Master process...
Jan 30 04:34:24 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [ALERT]    (218356) : Current worker (218358) exited with code 143 (Terminated)
Jan 30 04:34:24 np0005601977 neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d[218352]: [WARNING]  (218356) : All workers exited. Exiting... (0)
Jan 30 04:34:24 np0005601977 systemd[1]: libpod-9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66.scope: Deactivated successfully.
Jan 30 04:34:24 np0005601977 podman[220875]: 2026-01-30 09:34:24.710733459 +0000 UTC m=+0.050868401 container died 9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:34:24 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66-userdata-shm.mount: Deactivated successfully.
Jan 30 04:34:24 np0005601977 systemd[1]: var-lib-containers-storage-overlay-b4eb72d0412f2183f8b39edf5929bad799674d373be5105ec9ba0c5dae9a8b1a-merged.mount: Deactivated successfully.
Jan 30 04:34:24 np0005601977 kernel: tapf469de0f-e3: entered promiscuous mode
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.740 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 podman[220875]: 2026-01-30 09:34:24.740674605 +0000 UTC m=+0.080809547 container cleanup 9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00329|binding|INFO|Claiming lport f469de0f-e330-4b6b-853b-397301173e4e for this chassis.
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00330|binding|INFO|f469de0f-e330-4b6b-853b-397301173e4e: Claiming fa:16:3e:ac:3e:b3 10.100.0.10
Jan 30 04:34:24 np0005601977 kernel: tapf469de0f-e3 (unregistering): left promiscuous mode
Jan 30 04:34:24 np0005601977 NetworkManager[55565]: <info>  [1769765664.7453] manager: (tapf469de0f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.752 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:3e:b3 10.100.0.10'], port_security=['fa:16:3e:ac:3e:b3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': '465bb202-1df3-4c6e-82e9-19a120fe9790', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11cd1bd5-e27d-4fc7-95b6-d09dd95ff43a, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=f469de0f-e330-4b6b-853b-397301173e4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00331|binding|INFO|Setting lport f469de0f-e330-4b6b-853b-397301173e4e ovn-installed in OVS
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00332|binding|INFO|Setting lport f469de0f-e330-4b6b-853b-397301173e4e up in Southbound
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.756 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00333|binding|INFO|Releasing lport f469de0f-e330-4b6b-853b-397301173e4e from this chassis (sb_readonly=1)
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00334|binding|INFO|Removing iface tapf469de0f-e3 ovn-installed in OVS
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00335|if_status|INFO|Not setting lport f469de0f-e330-4b6b-853b-397301173e4e down as sb is readonly
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.759 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00336|binding|INFO|Releasing lport f469de0f-e330-4b6b-853b-397301173e4e from this chassis (sb_readonly=0)
Jan 30 04:34:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:24Z|00337|binding|INFO|Setting lport f469de0f-e330-4b6b-853b-397301173e4e down in Southbound
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.764 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 systemd[1]: libpod-conmon-9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66.scope: Deactivated successfully.
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.767 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:3e:b3 10.100.0.10'], port_security=['fa:16:3e:ac:3e:b3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '18ac4790-626b-4d8b-9ba9-34f94dfa7a3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': '465bb202-1df3-4c6e-82e9-19a120fe9790', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11cd1bd5-e27d-4fc7-95b6-d09dd95ff43a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=f469de0f-e330-4b6b-853b-397301173e4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.793 183134 INFO nova.virt.libvirt.driver [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Instance destroyed successfully.#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.794 183134 DEBUG nova.objects.instance [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:24 np0005601977 podman[220912]: 2026-01-30 09:34:24.805202859 +0000 UTC m=+0.039558454 container remove 9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.810 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1dd135-38f0-43a1-8659-24323a0d9ccb]: (4, ('Fri Jan 30 09:34:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d (9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66)\n9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66\nFri Jan 30 09:34:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d (9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66)\n9ec9f4f3dd8fbf99e7960d2bb483d2436b40e0a3729854e8af67b0db76858b66\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.811 183134 DEBUG nova.virt.libvirt.vif [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889211547',display_name='tempest-TestNetworkBasicOps-server-889211547',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889211547',id=27,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2/VAVW4dKAjJzKmpEVpgLzR2DqOw+zXoSl2UR7CzSDkh0dJUY6mqmCYKJXsB3oSw4pEb6SuJD9dMWrUIvpPRfPo26MvvheukmjkI2wXedigRfILfNn4wlbAs3XZgapng==',key_name='tempest-TestNetworkBasicOps-1660747615',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:32:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-1pbelgld',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:32:15Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=18ac4790-626b-4d8b-9ba9-34f94dfa7a3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.811 183134 DEBUG nova.network.os_vif_util [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.812 183134 DEBUG nova.network.os_vif_util [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.812 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[715b5eed-800a-415c-af97-c8064ca52a92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.812 183134 DEBUG os_vif [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.813 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap408e9205-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.815 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 kernel: tap408e9205-50: left promiscuous mode
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.817 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf469de0f-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.820 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.821 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.824 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.824 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.825 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.827 183134 INFO os_vif [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:3e:b3,bridge_name='br-int',has_traffic_filtering=True,id=f469de0f-e330-4b6b-853b-397301173e4e,network=Network(408e9205-54bc-4c8e-9fe0-c3c49be6610d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf469de0f-e3')#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.827 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab9227f-7d9f-4915-9895-f0d2ce112697]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.827 183134 INFO nova.virt.libvirt.driver [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Deleting instance files /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f_del#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.828 183134 INFO nova.virt.libvirt.driver [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Deletion of /var/lib/nova/instances/18ac4790-626b-4d8b-9ba9-34f94dfa7a3f_del complete#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.841 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6a255a3e-74fc-412e-b358-0a8e46e9c5a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.842 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[19f64947-138a-4d03-829a-23cf0d93d0bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.854 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c58a90aa-68a4-4692-ba64-44c74aadeb25]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406331, 'reachable_time': 42060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220935, 'error': None, 'target': 'ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.855 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-408e9205-54bc-4c8e-9fe0-c3c49be6610d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.855 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[ae361225-9754-4c3f-974d-aae2f555df08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.856 104706 INFO neutron.agent.ovn.metadata.agent [-] Port f469de0f-e330-4b6b-853b-397301173e4e in datapath 408e9205-54bc-4c8e-9fe0-c3c49be6610d unbound from our chassis#033[00m
Jan 30 04:34:24 np0005601977 systemd[1]: run-netns-ovnmeta\x2d408e9205\x2d54bc\x2d4c8e\x2d9fe0\x2dc3c49be6610d.mount: Deactivated successfully.
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.858 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 408e9205-54bc-4c8e-9fe0-c3c49be6610d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.859 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2262e437-c764-46eb-bad1-f9225ba4f044]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.859 104706 INFO neutron.agent.ovn.metadata.agent [-] Port f469de0f-e330-4b6b-853b-397301173e4e in datapath 408e9205-54bc-4c8e-9fe0-c3c49be6610d unbound from our chassis#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.861 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 408e9205-54bc-4c8e-9fe0-c3c49be6610d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:34:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:24.862 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2baecffe-fc70-48c4-936b-16fdbd070d21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.897 183134 INFO nova.compute.manager [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.898 183134 DEBUG oslo.service.loopingcall [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.898 183134 DEBUG nova.compute.manager [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:24 np0005601977 nova_compute[183130]: 2026-01-30 09:34:24.898 183134 DEBUG nova.network.neutron [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:25 np0005601977 nova_compute[183130]: 2026-01-30 09:34:25.276 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765650.2748475, aed146e3-865d-4aee-a055-42ed41e035c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:25 np0005601977 nova_compute[183130]: 2026-01-30 09:34:25.276 183134 INFO nova.compute.manager [-] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:25 np0005601977 nova_compute[183130]: 2026-01-30 09:34:25.306 183134 DEBUG nova.compute.manager [None req-42ed9b74-a9cc-4f6e-85a6-91a80a81db5b - - - - - -] [instance: aed146e3-865d-4aee-a055-42ed41e035c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:25 np0005601977 nova_compute[183130]: 2026-01-30 09:34:25.795 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.641 183134 DEBUG nova.network.neutron [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.664 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-unplugged-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.665 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.665 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.665 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.666 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-unplugged-f469de0f-e330-4b6b-853b-397301173e4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.666 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-unplugged-f469de0f-e330-4b6b-853b-397301173e4e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.666 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.667 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.667 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.667 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.667 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.668 183134 WARNING nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.668 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.668 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.669 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.669 183134 DEBUG oslo_concurrency.lockutils [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.669 183134 DEBUG nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.670 183134 WARNING nova.compute.manager [req-468c19b4-5531-449f-93d3-48ca514745df req-f5a8b106-3952-44f5-abfc-9fb4018ea6ba dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.683 183134 INFO nova.compute.manager [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Took 1.78 seconds to deallocate network for instance.#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.747 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.747 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.920 183134 DEBUG nova.compute.provider_tree [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:26 np0005601977 nova_compute[183130]: 2026-01-30 09:34:26.960 183134 DEBUG nova.scheduler.client.report [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.018 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:27Z|00338|binding|INFO|Releasing lport 50e26df2-7d93-4204-9b22-94b2140c0f47 from this chassis (sb_readonly=0)
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.054 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.132 183134 INFO nova.scheduler.client.report [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.250 183134 DEBUG oslo_concurrency.lockutils [None req-51afec66-6423-4798-b348-8e10d9240b63 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.737s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.665 183134 DEBUG nova.network.neutron [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updated VIF entry in instance network info cache for port f469de0f-e330-4b6b-853b-397301173e4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.666 183134 DEBUG nova.network.neutron [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Updating instance_info_cache with network_info: [{"id": "f469de0f-e330-4b6b-853b-397301173e4e", "address": "fa:16:3e:ac:3e:b3", "network": {"id": "408e9205-54bc-4c8e-9fe0-c3c49be6610d", "bridge": "br-int", "label": "tempest-network-smoke--1428691147", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf469de0f-e3", "ovs_interfaceid": "f469de0f-e330-4b6b-853b-397301173e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.699 183134 DEBUG oslo_concurrency.lockutils [req-9020e429-6969-45d2-beba-282599990e68 req-13153059-dca1-4c2e-badd-02262e8876c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-18ac4790-626b-4d8b-9ba9-34f94dfa7a3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:34:27 np0005601977 podman[220937]: 2026-01-30 09:34:27.85707539 +0000 UTC m=+0.060998944 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:34:27 np0005601977 podman[220936]: 2026-01-30 09:34:27.865291507 +0000 UTC m=+0.081045423 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.972 183134 INFO nova.virt.libvirt.driver [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Snapshot image upload complete#033[00m
Jan 30 04:34:27 np0005601977 nova_compute[183130]: 2026-01-30 09:34:27.973 183134 INFO nova.compute.manager [None req-5797d1f7-5789-4f2e-9877-f8b77da7470c 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Took 5.42 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.877 183134 DEBUG nova.compute.manager [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.878 183134 DEBUG oslo_concurrency.lockutils [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.878 183134 DEBUG oslo_concurrency.lockutils [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.879 183134 DEBUG oslo_concurrency.lockutils [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "18ac4790-626b-4d8b-9ba9-34f94dfa7a3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.880 183134 DEBUG nova.compute.manager [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] No waiting events found dispatching network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.880 183134 WARNING nova.compute.manager [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received unexpected event network-vif-plugged-f469de0f-e330-4b6b-853b-397301173e4e for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:34:28 np0005601977 nova_compute[183130]: 2026-01-30 09:34:28.881 183134 DEBUG nova.compute.manager [req-076ad181-11ed-4e49-ab1b-4c09da9b4154 req-5a0659f3-a7ba-4f35-813e-352b4a139c94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Received event network-vif-deleted-f469de0f-e330-4b6b-853b-397301173e4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:29 np0005601977 nova_compute[183130]: 2026-01-30 09:34:29.819 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.576 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.577 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.577 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.578 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.578 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.579 183134 INFO nova.compute.manager [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Terminating instance#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.581 183134 DEBUG nova.compute.manager [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.593 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765655.5927331, 65c66677-23b6-479a-863f-3dd277183a7d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.594 183134 INFO nova.compute.manager [-] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:30 np0005601977 kernel: tap4ebb9e31-70 (unregistering): left promiscuous mode
Jan 30 04:34:30 np0005601977 NetworkManager[55565]: <info>  [1769765670.6162] device (tap4ebb9e31-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:30Z|00339|binding|INFO|Releasing lport 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 from this chassis (sb_readonly=0)
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.620 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:30Z|00340|binding|INFO|Setting lport 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 down in Southbound
Jan 30 04:34:30 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:30Z|00341|binding|INFO|Removing iface tap4ebb9e31-70 ovn-installed in OVS
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.625 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.630 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:a5:03 10.100.0.8'], port_security=['fa:16:3e:52:a5:03 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6a7e9f4f-a651-4817-a679-b45828fcf5af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8980838c-37f7-45e5-9084-1321907354d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc00530c-da00-4b1f-8544-f4f16829e051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4c33c50-4f9e-4c9d-ac8f-b1ee6c0d33bf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=4ebb9e31-7061-4ecc-9cbf-98143a8361e4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.631 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 in datapath 8980838c-37f7-45e5-9084-1321907354d2 unbound from our chassis#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.632 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8980838c-37f7-45e5-9084-1321907354d2#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.633 183134 DEBUG nova.compute.manager [None req-3407c8e5-ab10-4da0-bfa1-8609ad977083 - - - - - -] [instance: 65c66677-23b6-479a-863f-3dd277183a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.635 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.646 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bd879795-d078-4e14-9d46-96738bf898cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.666 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[05603522-9358-4749-8ef6-8092a7e47672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.669 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[934f4771-46c8-417e-8467-4af1ba7c9890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:30 np0005601977 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 30 04:34:30 np0005601977 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Consumed 13.676s CPU time.
Jan 30 04:34:30 np0005601977 systemd-machined[154431]: Machine qemu-25-instance-00000020 terminated.
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.687 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb5f37e-49c0-490e-ad97-e02457f2d961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.705 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0b1cd2-8555-412f-8845-4d615359e7d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8980838c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:7d:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411642, 'reachable_time': 33905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220992, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.717 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[eda5c42b-b980-4c6f-accb-849fe800eb77]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8980838c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411649, 'tstamp': 411649}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220993, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8980838c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411650, 'tstamp': 411650}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220993, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.719 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8980838c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.721 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.724 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.724 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8980838c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.725 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.725 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8980838c-30, col_values=(('external_ids', {'iface-id': '50e26df2-7d93-4204-9b22-94b2140c0f47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:30 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:30.725 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.797 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.831 183134 INFO nova.virt.libvirt.driver [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Instance destroyed successfully.#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.831 183134 DEBUG nova.objects.instance [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lazy-loading 'resources' on Instance uuid 6a7e9f4f-a651-4817-a679-b45828fcf5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.850 183134 DEBUG nova.virt.libvirt.vif [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:33:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-85653886',display_name='tempest-TestSnapshotPattern-server-85653886',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-85653886',id=32,image_ref='a6e939c4-3cd5-464f-b227-1809e53fe850',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBID/K14fCFZHI5JWVWJUAGxCxMra4dWigp369J3sRUJzCc186c+CfXnLX8j6/t+x/1d86id47fcfCswYvS7jgUYMt+7CnhVorESuhnLGDpdEvTT2EjLPSXUofPGYaVdusg==',key_name='tempest-TestSnapshotPattern-1959333507',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8960c51c5e7f4c65928b539d6bd01b08',ramdisk_id='',reservation_id='r-y5fo9o9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='8aafaddd-1368-427e-8596-2b5871053f79',image_min_disk='1',image_min_ram='0',image_owner_id='8960c51c5e7f4c65928b539d6bd01b08',image_owner_project_name='tempest-TestSnapshotPattern-1319331586',image_owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member',image_user_id='7701defc672143599a29756b7b25b4dc',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-1319331586',owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:34:28Z,user_data=None,user_id='7701defc672143599a29756b7b25b4dc',uuid=6a7e9f4f-a651-4817-a679-b45828fcf5af,vcpu_mo
del=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.850 183134 DEBUG nova.network.os_vif_util [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converting VIF {"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.851 183134 DEBUG nova.network.os_vif_util [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.851 183134 DEBUG os_vif [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.852 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.852 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ebb9e31-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.854 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.855 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.857 183134 INFO os_vif [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:a5:03,bridge_name='br-int',has_traffic_filtering=True,id=4ebb9e31-7061-4ecc-9cbf-98143a8361e4,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ebb9e31-70')#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.857 183134 INFO nova.virt.libvirt.driver [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Deleting instance files /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af_del#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.857 183134 INFO nova.virt.libvirt.driver [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Deletion of /var/lib/nova/instances/6a7e9f4f-a651-4817-a679-b45828fcf5af_del complete#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.874 183134 DEBUG nova.compute.manager [req-21239746-67dd-4d2a-b893-152e819f1091 req-81421936-6dbf-416f-b0e6-86c71de1a3b5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-vif-unplugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.874 183134 DEBUG oslo_concurrency.lockutils [req-21239746-67dd-4d2a-b893-152e819f1091 req-81421936-6dbf-416f-b0e6-86c71de1a3b5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.874 183134 DEBUG oslo_concurrency.lockutils [req-21239746-67dd-4d2a-b893-152e819f1091 req-81421936-6dbf-416f-b0e6-86c71de1a3b5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.875 183134 DEBUG oslo_concurrency.lockutils [req-21239746-67dd-4d2a-b893-152e819f1091 req-81421936-6dbf-416f-b0e6-86c71de1a3b5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.875 183134 DEBUG nova.compute.manager [req-21239746-67dd-4d2a-b893-152e819f1091 req-81421936-6dbf-416f-b0e6-86c71de1a3b5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] No waiting events found dispatching network-vif-unplugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.875 183134 DEBUG nova.compute.manager [req-21239746-67dd-4d2a-b893-152e819f1091 req-81421936-6dbf-416f-b0e6-86c71de1a3b5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-vif-unplugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.927 183134 INFO nova.compute.manager [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.928 183134 DEBUG oslo.service.loopingcall [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.928 183134 DEBUG nova.compute.manager [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.928 183134 DEBUG nova.network.neutron [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.989 183134 DEBUG nova.compute.manager [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-changed-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.990 183134 DEBUG nova.compute.manager [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Refreshing instance network info cache due to event network-changed-4ebb9e31-7061-4ecc-9cbf-98143a8361e4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.990 183134 DEBUG oslo_concurrency.lockutils [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.990 183134 DEBUG oslo_concurrency.lockutils [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:34:30 np0005601977 nova_compute[183130]: 2026-01-30 09:34:30.991 183134 DEBUG nova.network.neutron [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Refreshing network info cache for port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:34:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:32Z|00342|binding|INFO|Releasing lport 50e26df2-7d93-4204-9b22-94b2140c0f47 from this chassis (sb_readonly=0)
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.054 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.696 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765657.6943445, 0e693c72-183a-4005-8891-207b95ad22b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.696 183134 INFO nova.compute.manager [-] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.717 183134 DEBUG nova.compute.manager [None req-37307993-5c0c-49ce-99c1-1fc5881bea88 - - - - - -] [instance: 0e693c72-183a-4005-8891-207b95ad22b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:32 np0005601977 podman[221012]: 2026-01-30 09:34:32.937210641 +0000 UTC m=+0.146098892 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.986 183134 DEBUG nova.compute.manager [req-45e9438e-5465-48ce-ba7d-9cc6d17ac34f req-69dae5ea-b6a7-4382-bb94-d5e431bd55aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.987 183134 DEBUG oslo_concurrency.lockutils [req-45e9438e-5465-48ce-ba7d-9cc6d17ac34f req-69dae5ea-b6a7-4382-bb94-d5e431bd55aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.987 183134 DEBUG oslo_concurrency.lockutils [req-45e9438e-5465-48ce-ba7d-9cc6d17ac34f req-69dae5ea-b6a7-4382-bb94-d5e431bd55aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.988 183134 DEBUG oslo_concurrency.lockutils [req-45e9438e-5465-48ce-ba7d-9cc6d17ac34f req-69dae5ea-b6a7-4382-bb94-d5e431bd55aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.988 183134 DEBUG nova.compute.manager [req-45e9438e-5465-48ce-ba7d-9cc6d17ac34f req-69dae5ea-b6a7-4382-bb94-d5e431bd55aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] No waiting events found dispatching network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:34:32 np0005601977 nova_compute[183130]: 2026-01-30 09:34:32.988 183134 WARNING nova.compute.manager [req-45e9438e-5465-48ce-ba7d-9cc6d17ac34f req-69dae5ea-b6a7-4382-bb94-d5e431bd55aa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received unexpected event network-vif-plugged-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 for instance with vm_state active and task_state deleting.
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.081 183134 DEBUG nova.network.neutron [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.102 183134 INFO nova.compute.manager [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Took 2.17 seconds to deallocate network for instance.
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.153 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.154 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.255 183134 DEBUG nova.compute.provider_tree [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.277 183134 DEBUG nova.scheduler.client.report [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.343 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.379 183134 INFO nova.scheduler.client.report [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Deleted allocations for instance 6a7e9f4f-a651-4817-a679-b45828fcf5af
Jan 30 04:34:33 np0005601977 nova_compute[183130]: 2026-01-30 09:34:33.555 183134 DEBUG oslo_concurrency.lockutils [None req-17585b88-6b59-4ff5-ad04-7a76fe2243d6 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "6a7e9f4f-a651-4817-a679-b45828fcf5af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:34:34 np0005601977 nova_compute[183130]: 2026-01-30 09:34:34.123 183134 DEBUG nova.network.neutron [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updated VIF entry in instance network info cache for port 4ebb9e31-7061-4ecc-9cbf-98143a8361e4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:34:34 np0005601977 nova_compute[183130]: 2026-01-30 09:34:34.124 183134 DEBUG nova.network.neutron [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Updating instance_info_cache with network_info: [{"id": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "address": "fa:16:3e:52:a5:03", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ebb9e31-70", "ovs_interfaceid": "4ebb9e31-7061-4ecc-9cbf-98143a8361e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:34:34 np0005601977 nova_compute[183130]: 2026-01-30 09:34:34.167 183134 DEBUG oslo_concurrency.lockutils [req-f5c0c62f-187d-44dc-8ccb-5f266f06236a req-faba2456-d310-419e-8e9e-03470fa4ed4e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-6a7e9f4f-a651-4817-a679-b45828fcf5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:34:35 np0005601977 nova_compute[183130]: 2026-01-30 09:34:35.184 183134 DEBUG nova.compute.manager [req-48a53513-55f0-4399-98cc-b0c1789e12ad req-a2cf7502-423d-4b3f-8af4-42f0d38c0945 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Received event network-vif-deleted-4ebb9e31-7061-4ecc-9cbf-98143a8361e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:34:35 np0005601977 nova_compute[183130]: 2026-01-30 09:34:35.803 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:35 np0005601977 nova_compute[183130]: 2026-01-30 09:34:35.854 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:36 np0005601977 nova_compute[183130]: 2026-01-30 09:34:36.130 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.081 183134 DEBUG nova.compute.manager [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-changed-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.081 183134 DEBUG nova.compute.manager [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Refreshing instance network info cache due to event network-changed-c0d4f325-5a98-4a02-aa86-34097b369c03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.081 183134 DEBUG oslo_concurrency.lockutils [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.082 183134 DEBUG oslo_concurrency.lockutils [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.082 183134 DEBUG nova.network.neutron [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Refreshing network info cache for port c0d4f325-5a98-4a02-aa86-34097b369c03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.117 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.118 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.118 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.118 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.119 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.120 183134 INFO nova.compute.manager [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Terminating instance
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.122 183134 DEBUG nova.compute.manager [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 30 04:34:37 np0005601977 kernel: tapc0d4f325-5a (unregistering): left promiscuous mode
Jan 30 04:34:37 np0005601977 NetworkManager[55565]: <info>  [1769765677.1594] device (tapc0d4f325-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:34:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:37Z|00343|binding|INFO|Releasing lport c0d4f325-5a98-4a02-aa86-34097b369c03 from this chassis (sb_readonly=0)
Jan 30 04:34:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:37Z|00344|binding|INFO|Setting lport c0d4f325-5a98-4a02-aa86-34097b369c03 down in Southbound
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.165 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:34:37Z|00345|binding|INFO|Removing iface tapc0d4f325-5a ovn-installed in OVS
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.166 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.175 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:bd:f9 10.100.0.14'], port_security=['fa:16:3e:8b:bd:f9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8aafaddd-1368-427e-8596-2b5871053f79', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8980838c-37f7-45e5-9084-1321907354d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8960c51c5e7f4c65928b539d6bd01b08', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc00530c-da00-4b1f-8544-f4f16829e051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4c33c50-4f9e-4c9d-ac8f-b1ee6c0d33bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=c0d4f325-5a98-4a02-aa86-34097b369c03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.176 104706 INFO neutron.agent.ovn.metadata.agent [-] Port c0d4f325-5a98-4a02-aa86-34097b369c03 in datapath 8980838c-37f7-45e5-9084-1321907354d2 unbound from our chassis
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.176 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.177 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8980838c-37f7-45e5-9084-1321907354d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.179 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aa7aa831-3afc-4ea1-9f48-7846bbabbd7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.179 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8980838c-37f7-45e5-9084-1321907354d2 namespace which is not needed anymore
Jan 30 04:34:37 np0005601977 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 30 04:34:37 np0005601977 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 15.245s CPU time.
Jan 30 04:34:37 np0005601977 systemd-machined[154431]: Machine qemu-22-instance-0000001d terminated.
Jan 30 04:34:37 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [NOTICE]   (219431) : haproxy version is 2.8.14-c23fe91
Jan 30 04:34:37 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [NOTICE]   (219431) : path to executable is /usr/sbin/haproxy
Jan 30 04:34:37 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [WARNING]  (219431) : Exiting Master process...
Jan 30 04:34:37 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [ALERT]    (219431) : Current worker (219433) exited with code 143 (Terminated)
Jan 30 04:34:37 np0005601977 neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2[219427]: [WARNING]  (219431) : All workers exited. Exiting... (0)
Jan 30 04:34:37 np0005601977 systemd[1]: libpod-9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139.scope: Deactivated successfully.
Jan 30 04:34:37 np0005601977 podman[221064]: 2026-01-30 09:34:37.283047834 +0000 UTC m=+0.040579884 container died 9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:34:37 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139-userdata-shm.mount: Deactivated successfully.
Jan 30 04:34:37 np0005601977 systemd[1]: var-lib-containers-storage-overlay-2c7d1b2d0e7c17734df5faab5bff318464acf24aec429435e68e8127b660fe37-merged.mount: Deactivated successfully.
Jan 30 04:34:37 np0005601977 podman[221064]: 2026-01-30 09:34:37.324691677 +0000 UTC m=+0.082223727 container cleanup 9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 30 04:34:37 np0005601977 systemd[1]: libpod-conmon-9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139.scope: Deactivated successfully.
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.343 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.345 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.374 183134 INFO nova.virt.libvirt.driver [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Instance destroyed successfully.
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.375 183134 DEBUG nova.objects.instance [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lazy-loading 'resources' on Instance uuid 8aafaddd-1368-427e-8596-2b5871053f79 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:34:37 np0005601977 podman[221092]: 2026-01-30 09:34:37.380686225 +0000 UTC m=+0.040317206 container remove 9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.385 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[80677a0d-28bd-4596-9ace-910015594e1a]: (4, ('Fri Jan 30 09:34:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2 (9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139)\n9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139\nFri Jan 30 09:34:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8980838c-37f7-45e5-9084-1321907354d2 (9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139)\n9a916f0484582e431f04d94bcae1432afdd5dee84ceafa128192cecc8cd01139\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.387 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dce72ded-295e-439d-ab67-b400775238d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.388 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8980838c-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.390 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 kernel: tap8980838c-30: left promiscuous mode
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.395 183134 DEBUG nova.virt.libvirt.vif [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1086942880',display_name='tempest-TestSnapshotPattern-server-1086942880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1086942880',id=29,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBID/K14fCFZHI5JWVWJUAGxCxMra4dWigp369J3sRUJzCc186c+CfXnLX8j6/t+x/1d86id47fcfCswYvS7jgUYMt+7CnhVorESuhnLGDpdEvTT2EjLPSXUofPGYaVdusg==',key_name='tempest-TestSnapshotPattern-1959333507',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:33:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8960c51c5e7f4c65928b539d6bd01b08',ramdisk_id='',reservation_id='r-izyluhvh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-1319331586',owner_user_name='tempest-TestSnapshotPattern-1319331586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:33:32Z,user_data=None,user_id='7701defc672143599a29756b7b25b4dc',uuid=8aafaddd-1368-427e-8596-2b5871053f79,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.396 183134 DEBUG nova.network.os_vif_util [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converting VIF {"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.397 183134 DEBUG nova.network.os_vif_util [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.398 183134 DEBUG os_vif [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.400 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.400 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0d4f325-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.401 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.401 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.401 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a24e9dbc-4e73-4821-ae4d-934dd9bf9b16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.403 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.405 183134 INFO os_vif [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:bd:f9,bridge_name='br-int',has_traffic_filtering=True,id=c0d4f325-5a98-4a02-aa86-34097b369c03,network=Network(8980838c-37f7-45e5-9084-1321907354d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0d4f325-5a')
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.405 183134 INFO nova.virt.libvirt.driver [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Deleting instance files /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79_del
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.406 183134 INFO nova.virt.libvirt.driver [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Deletion of /var/lib/nova/instances/8aafaddd-1368-427e-8596-2b5871053f79_del complete
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.424 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6126f975-da08-4ffc-b078-a8d4d1db578a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.425 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d747ae65-ea45-4465-afec-794fd8ad6da4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.444 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0670e23a-2ae6-4ca5-b519-e9cc9ccedb5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411637, 'reachable_time': 42838, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221125, 'error': None, 'target': 'ovnmeta-8980838c-37f7-45e5-9084-1321907354d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.448 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8980838c-37f7-45e5-9084-1321907354d2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:34:37 np0005601977 systemd[1]: run-netns-ovnmeta\x2d8980838c\x2d37f7\x2d45e5\x2d9084\x2d1321907354d2.mount: Deactivated successfully.
Jan 30 04:34:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:37.448 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[712bfcea-ea23-4e12-ac52-18c6de1c4cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.479 183134 INFO nova.compute.manager [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.480 183134 DEBUG oslo.service.loopingcall [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.480 183134 DEBUG nova.compute.manager [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:34:37 np0005601977 nova_compute[183130]: 2026-01-30 09:34:37.480 183134 DEBUG nova.network.neutron [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.041 183134 DEBUG nova.network.neutron [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.062 183134 INFO nova.compute.manager [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Took 0.58 seconds to deallocate network for instance.#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.119 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.120 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.126 183134 DEBUG nova.compute.manager [req-8ce9512b-08f0-49f0-be40-8069968ac130 req-125967cf-24e5-40af-ad49-1624b1c66ee7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-vif-deleted-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.201 183134 DEBUG nova.compute.provider_tree [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.241 183134 DEBUG nova.scheduler.client.report [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.267 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.289 183134 INFO nova.scheduler.client.report [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Deleted allocations for instance 8aafaddd-1368-427e-8596-2b5871053f79#033[00m
Jan 30 04:34:38 np0005601977 nova_compute[183130]: 2026-01-30 09:34:38.373 183134 DEBUG oslo_concurrency.lockutils [None req-2566e839-5d14-4eb5-bb01-4615be99ffdc 7701defc672143599a29756b7b25b4dc 8960c51c5e7f4c65928b539d6bd01b08 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.223 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.229 183134 DEBUG nova.compute.manager [req-3994beb8-18de-4000-8f0e-5c87fa2ace68 req-a1c38c00-a00b-4ec7-8b3f-544f75f4d442 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.230 183134 DEBUG oslo_concurrency.lockutils [req-3994beb8-18de-4000-8f0e-5c87fa2ace68 req-a1c38c00-a00b-4ec7-8b3f-544f75f4d442 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "8aafaddd-1368-427e-8596-2b5871053f79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.230 183134 DEBUG oslo_concurrency.lockutils [req-3994beb8-18de-4000-8f0e-5c87fa2ace68 req-a1c38c00-a00b-4ec7-8b3f-544f75f4d442 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.230 183134 DEBUG oslo_concurrency.lockutils [req-3994beb8-18de-4000-8f0e-5c87fa2ace68 req-a1c38c00-a00b-4ec7-8b3f-544f75f4d442 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "8aafaddd-1368-427e-8596-2b5871053f79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.230 183134 DEBUG nova.compute.manager [req-3994beb8-18de-4000-8f0e-5c87fa2ace68 req-a1c38c00-a00b-4ec7-8b3f-544f75f4d442 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] No waiting events found dispatching network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.231 183134 WARNING nova.compute.manager [req-3994beb8-18de-4000-8f0e-5c87fa2ace68 req-a1c38c00-a00b-4ec7-8b3f-544f75f4d442 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Received unexpected event network-vif-plugged-c0d4f325-5a98-4a02-aa86-34097b369c03 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.792 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765664.786699, 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.792 183134 INFO nova.compute.manager [-] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:39 np0005601977 nova_compute[183130]: 2026-01-30 09:34:39.816 183134 DEBUG nova.compute.manager [None req-3e0cf7f1-157e-4c4f-83db-5a54594c5c62 - - - - - -] [instance: 18ac4790-626b-4d8b-9ba9-34f94dfa7a3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:40 np0005601977 nova_compute[183130]: 2026-01-30 09:34:40.257 183134 DEBUG nova.network.neutron [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updated VIF entry in instance network info cache for port c0d4f325-5a98-4a02-aa86-34097b369c03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:34:40 np0005601977 nova_compute[183130]: 2026-01-30 09:34:40.259 183134 DEBUG nova.network.neutron [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Updating instance_info_cache with network_info: [{"id": "c0d4f325-5a98-4a02-aa86-34097b369c03", "address": "fa:16:3e:8b:bd:f9", "network": {"id": "8980838c-37f7-45e5-9084-1321907354d2", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1563042041-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8960c51c5e7f4c65928b539d6bd01b08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0d4f325-5a", "ovs_interfaceid": "c0d4f325-5a98-4a02-aa86-34097b369c03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:34:40 np0005601977 nova_compute[183130]: 2026-01-30 09:34:40.330 183134 DEBUG oslo_concurrency.lockutils [req-389b5af8-2d76-49be-a02a-2faf7f67a4d5 req-88e4c319-9bb0-44b0-83f9-1e3169292302 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-8aafaddd-1368-427e-8596-2b5871053f79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:34:40 np0005601977 nova_compute[183130]: 2026-01-30 09:34:40.804 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:40 np0005601977 podman[221126]: 2026-01-30 09:34:40.821411223 +0000 UTC m=+0.041063678 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:34:42 np0005601977 nova_compute[183130]: 2026-01-30 09:34:42.401 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:44 np0005601977 nova_compute[183130]: 2026-01-30 09:34:44.821 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:44.821 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:34:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:44.823 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:34:45 np0005601977 nova_compute[183130]: 2026-01-30 09:34:45.805 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:45 np0005601977 nova_compute[183130]: 2026-01-30 09:34:45.830 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765670.8294358, 6a7e9f4f-a651-4817-a679-b45828fcf5af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:45 np0005601977 nova_compute[183130]: 2026-01-30 09:34:45.831 183134 INFO nova.compute.manager [-] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:45 np0005601977 nova_compute[183130]: 2026-01-30 09:34:45.853 183134 DEBUG nova.compute.manager [None req-4f4baafd-e361-4045-b063-27e537b2eec1 - - - - - -] [instance: 6a7e9f4f-a651-4817-a679-b45828fcf5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:47 np0005601977 nova_compute[183130]: 2026-01-30 09:34:47.403 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:49 np0005601977 nova_compute[183130]: 2026-01-30 09:34:49.716 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:50 np0005601977 nova_compute[183130]: 2026-01-30 09:34:50.144 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:50 np0005601977 nova_compute[183130]: 2026-01-30 09:34:50.808 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:50 np0005601977 podman[221150]: 2026-01-30 09:34:50.841856664 +0000 UTC m=+0.056250607 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, architecture=x86_64)
Jan 30 04:34:50 np0005601977 podman[221151]: 2026-01-30 09:34:50.841954627 +0000 UTC m=+0.058578294 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true)
Jan 30 04:34:51 np0005601977 nova_compute[183130]: 2026-01-30 09:34:51.127 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:51 np0005601977 nova_compute[183130]: 2026-01-30 09:34:51.224 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:52 np0005601977 nova_compute[183130]: 2026-01-30 09:34:52.372 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765677.3709161, 8aafaddd-1368-427e-8596-2b5871053f79 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:34:52 np0005601977 nova_compute[183130]: 2026-01-30 09:34:52.372 183134 INFO nova.compute.manager [-] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:34:52 np0005601977 nova_compute[183130]: 2026-01-30 09:34:52.393 183134 DEBUG nova.compute.manager [None req-f8815b92-eb98-4a61-90c9-e031b0bebfdc - - - - - -] [instance: 8aafaddd-1368-427e-8596-2b5871053f79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:34:52 np0005601977 nova_compute[183130]: 2026-01-30 09:34:52.405 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:54 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:54.826 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.486 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.487 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.511 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.585 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.586 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.599 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.599 183134 INFO nova.compute.claims [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.737 183134 DEBUG nova.compute.provider_tree [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.797 183134 DEBUG nova.scheduler.client.report [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.809 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.907 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:55 np0005601977 nova_compute[183130]: 2026-01-30 09:34:55.908 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.330 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.330 183134 DEBUG nova.network.neutron [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.373 183134 INFO nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.413 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.531 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.533 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.534 183134 INFO nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Creating image(s)#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.535 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.535 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.536 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.554 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.613 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.614 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.614 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.629 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.673 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.674 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.699 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.700 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.700 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.760 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.762 183134 DEBUG nova.virt.disk.api [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.762 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.839 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.841 183134 DEBUG nova.virt.disk.api [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.841 183134 DEBUG nova.objects.instance [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.870 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.871 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Ensure instance console log exists: /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.872 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.872 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:56 np0005601977 nova_compute[183130]: 2026-01-30 09:34:56.873 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:57 np0005601977 nova_compute[183130]: 2026-01-30 09:34:57.031 183134 DEBUG nova.policy [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:34:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:57.386 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:34:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:57.387 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:34:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:34:57.387 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:57 np0005601977 nova_compute[183130]: 2026-01-30 09:34:57.428 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:34:58 np0005601977 podman[221207]: 2026-01-30 09:34:58.836421974 +0000 UTC m=+0.046612208 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:34:58 np0005601977 podman[221206]: 2026-01-30 09:34:58.841992605 +0000 UTC m=+0.054319231 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:34:59 np0005601977 nova_compute[183130]: 2026-01-30 09:34:59.009 183134 DEBUG nova.network.neutron [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Successfully created port: a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:35:00 np0005601977 nova_compute[183130]: 2026-01-30 09:35:00.626 183134 DEBUG nova.network.neutron [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Successfully updated port: a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:35:00 np0005601977 nova_compute[183130]: 2026-01-30 09:35:00.642 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:00 np0005601977 nova_compute[183130]: 2026-01-30 09:35:00.643 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:00 np0005601977 nova_compute[183130]: 2026-01-30 09:35:00.643 183134 DEBUG nova.network.neutron [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:35:00 np0005601977 nova_compute[183130]: 2026-01-30 09:35:00.840 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:00 np0005601977 nova_compute[183130]: 2026-01-30 09:35:00.851 183134 DEBUG nova.network.neutron [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:35:01 np0005601977 nova_compute[183130]: 2026-01-30 09:35:01.178 183134 DEBUG nova.compute.manager [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-changed-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:01 np0005601977 nova_compute[183130]: 2026-01-30 09:35:01.179 183134 DEBUG nova.compute.manager [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Refreshing instance network info cache due to event network-changed-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:35:01 np0005601977 nova_compute[183130]: 2026-01-30 09:35:01.179 183134 DEBUG oslo_concurrency.lockutils [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:01 np0005601977 nova_compute[183130]: 2026-01-30 09:35:01.987 183134 DEBUG nova.network.neutron [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.013 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.014 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance network_info: |[{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.014 183134 DEBUG oslo_concurrency.lockutils [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.015 183134 DEBUG nova.network.neutron [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Refreshing network info cache for port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.017 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Start _get_guest_xml network_info=[{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.021 183134 WARNING nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.029 183134 DEBUG nova.virt.libvirt.host [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.030 183134 DEBUG nova.virt.libvirt.host [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.035 183134 DEBUG nova.virt.libvirt.host [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.036 183134 DEBUG nova.virt.libvirt.host [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.037 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.037 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.038 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.038 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.038 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.038 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.039 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.039 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.039 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.039 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.039 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.040 183134 DEBUG nova.virt.hardware [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.043 183134 DEBUG nova.virt.libvirt.vif [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1374074683',display_name='tempest-TestNetworkAdvancedServerOps-server-1374074683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1374074683',id=34,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdPptm/HnzkqTImZqDAl7EyO2iWhxnsGpAr0rtNjW8XlU6U9DL+LFW5w4FNGYvckrrl1CqBUZRMRpzopHQZlljyrRKm4azU2AW9n9Zcwea07xun0VFkuuj0zR3/K+H88A==',key_name='tempest-TestNetworkAdvancedServerOps-15479994',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-svapo3ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:34:56Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=119ec1c9-9292-45e9-ae02-0750dc595ccc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.043 183134 DEBUG nova.network.os_vif_util [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.044 183134 DEBUG nova.network.os_vif_util [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.045 183134 DEBUG nova.objects.instance [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.060 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <uuid>119ec1c9-9292-45e9-ae02-0750dc595ccc</uuid>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <name>instance-00000022</name>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1374074683</nova:name>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:35:02</nova:creationTime>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        <nova:port uuid="a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <entry name="serial">119ec1c9-9292-45e9-ae02-0750dc595ccc</entry>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <entry name="uuid">119ec1c9-9292-45e9-ae02-0750dc595ccc</entry>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:70:a6:fc"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <target dev="tapa6bc941a-f4"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/console.log" append="off"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:35:02 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:35:02 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:35:02 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:35:02 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.060 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Preparing to wait for external event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.061 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.061 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.061 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.062 183134 DEBUG nova.virt.libvirt.vif [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1374074683',display_name='tempest-TestNetworkAdvancedServerOps-server-1374074683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1374074683',id=34,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdPptm/HnzkqTImZqDAl7EyO2iWhxnsGpAr0rtNjW8XlU6U9DL+LFW5w4FNGYvckrrl1CqBUZRMRpzopHQZlljyrRKm4azU2AW9n9Zcwea07xun0VFkuuj0zR3/K+H88A==',key_name='tempest-TestNetworkAdvancedServerOps-15479994',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-svapo3ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:34:56Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=119ec1c9-9292-45e9-ae02-0750dc595ccc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.062 183134 DEBUG nova.network.os_vif_util [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.062 183134 DEBUG nova.network.os_vif_util [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.063 183134 DEBUG os_vif [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.063 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.064 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.064 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.066 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6bc941a-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.067 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6bc941a-f4, col_values=(('external_ids', {'iface-id': 'a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:a6:fc', 'vm-uuid': '119ec1c9-9292-45e9-ae02-0750dc595ccc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.068 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 NetworkManager[55565]: <info>  [1769765702.0695] manager: (tapa6bc941a-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.070 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.072 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.073 183134 INFO os_vif [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4')#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.124 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.124 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.124 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:70:a6:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.125 183134 INFO nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Using config drive#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.656 183134 INFO nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Creating config drive at /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.660 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptg0tudkn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.776 183134 DEBUG oslo_concurrency.processutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptg0tudkn" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:02 np0005601977 kernel: tapa6bc941a-f4: entered promiscuous mode
Jan 30 04:35:02 np0005601977 NetworkManager[55565]: <info>  [1769765702.8330] manager: (tapa6bc941a-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.834 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:02Z|00346|binding|INFO|Claiming lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for this chassis.
Jan 30 04:35:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:02Z|00347|binding|INFO|a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17: Claiming fa:16:3e:70:a6:fc 10.100.0.12
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.844 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.859 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:a6:fc 10.100.0.12'], port_security=['fa:16:3e:70:a6:fc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8954061-53dd-4ae5-b1f3-551811d4d932', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '10c9590a-9772-4136-9d98-7fa939664eb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eeedb5f-c60c-41d6-af1d-b2452d4a440d, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.861 104706 INFO neutron.agent.ovn.metadata.agent [-] Port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 in datapath b8954061-53dd-4ae5-b1f3-551811d4d932 bound to our chassis#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.862 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8954061-53dd-4ae5-b1f3-551811d4d932#033[00m
Jan 30 04:35:02 np0005601977 systemd-udevd[221269]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:35:02 np0005601977 systemd-machined[154431]: New machine qemu-28-instance-00000022.
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.874 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[56ee87a1-81b7-46d5-a387-1a89bed342df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.875 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8954061-51 in ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.876 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8954061-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.876 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2416f6fb-d8a5-4392-8e1e-1312ae79a4ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.877 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0e9f9ed0-024c-475d-8796-9f79119341f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 NetworkManager[55565]: <info>  [1769765702.8826] device (tapa6bc941a-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:35:02 np0005601977 NetworkManager[55565]: <info>  [1769765702.8839] device (tapa6bc941a-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:35:02 np0005601977 systemd[1]: Started Virtual Machine qemu-28-instance-00000022.
Jan 30 04:35:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:02Z|00348|binding|INFO|Setting lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 ovn-installed in OVS
Jan 30 04:35:02 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:02Z|00349|binding|INFO|Setting lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 up in Southbound
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.885 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[1700adc2-9062-4ea6-a4eb-0c64ae66dd35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 nova_compute[183130]: 2026-01-30 09:35:02.886 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.907 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9df444-3270-470e-8481-6b612844bf91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.928 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[502b6419-27bc-4ff9-beb5-7b281934eda3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 NetworkManager[55565]: <info>  [1769765702.9339] manager: (tapb8954061-50): new Veth device (/org/freedesktop/NetworkManager/Devices/135)
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.933 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a78c736c-198e-48b9-b369-08983b99e795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.966 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a665c3-74fb-41b2-842b-d95c108e53b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.970 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0d05c627-4b3d-487f-8bdb-ad46762ad953]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:02 np0005601977 NetworkManager[55565]: <info>  [1769765702.9892] device (tapb8954061-50): carrier: link connected
Jan 30 04:35:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:02.993 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc630b1-104a-4d13-9e69-97724e4c8d48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.006 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[34b338b9-3774-4762-b100-4af3a28d12a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8954061-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b3:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423432, 'reachable_time': 29099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221312, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.017 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1c09a56e-b411-493c-9b54-b8125addc0b7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:b365'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423432, 'tstamp': 423432}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221317, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.030 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[140afa7e-613b-4d02-91cf-f23fd2b1ec01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8954061-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b3:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423432, 'reachable_time': 29099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221321, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.069 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3cbd4f11-6b01-44ce-80ed-5183ec5163a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 podman[221293]: 2026-01-30 09:35:03.091558805 +0000 UTC m=+0.118622209 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.113 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[24ff8d5e-b16f-457f-88e3-731cebc6a17e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.114 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8954061-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.115 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.116 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8954061-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.118 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:03 np0005601977 kernel: tapb8954061-50: entered promiscuous mode
Jan 30 04:35:03 np0005601977 NetworkManager[55565]: <info>  [1769765703.1222] manager: (tapb8954061-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.122 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.124 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8954061-50, col_values=(('external_ids', {'iface-id': '8090f739-5047-4897-b693-95848bd03441'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.125 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:03 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:03Z|00350|binding|INFO|Releasing lport 8090f739-5047-4897-b693-95848bd03441 from this chassis (sb_readonly=0)
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.131 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.132 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8954061-53dd-4ae5-b1f3-551811d4d932.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8954061-53dd-4ae5-b1f3-551811d4d932.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.133 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[85a1e94a-91a0-4696-ab99-7f01a7053d75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.134 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b8954061-53dd-4ae5-b1f3-551811d4d932
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b8954061-53dd-4ae5-b1f3-551811d4d932.pid.haproxy
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b8954061-53dd-4ae5-b1f3-551811d4d932
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:35:03 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:03.135 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'env', 'PROCESS_TAG=haproxy-b8954061-53dd-4ae5-b1f3-551811d4d932', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8954061-53dd-4ae5-b1f3-551811d4d932.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.367 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.368 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.369 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.369 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.412 183134 DEBUG nova.compute.manager [req-aaf287e7-7a41-47b6-b16f-7dfbb47b8c6e req-4f4397e7-7dc8-4d1a-b84e-2375fe179f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.416 183134 DEBUG oslo_concurrency.lockutils [req-aaf287e7-7a41-47b6-b16f-7dfbb47b8c6e req-4f4397e7-7dc8-4d1a-b84e-2375fe179f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.417 183134 DEBUG oslo_concurrency.lockutils [req-aaf287e7-7a41-47b6-b16f-7dfbb47b8c6e req-4f4397e7-7dc8-4d1a-b84e-2375fe179f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.418 183134 DEBUG oslo_concurrency.lockutils [req-aaf287e7-7a41-47b6-b16f-7dfbb47b8c6e req-4f4397e7-7dc8-4d1a-b84e-2375fe179f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.418 183134 DEBUG nova.compute.manager [req-aaf287e7-7a41-47b6-b16f-7dfbb47b8c6e req-4f4397e7-7dc8-4d1a-b84e-2375fe179f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Processing event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.463 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:03 np0005601977 podman[221361]: 2026-01-30 09:35:03.489959638 +0000 UTC m=+0.050246123 container create 2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:35:03 np0005601977 systemd[1]: Started libpod-conmon-2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20.scope.
Jan 30 04:35:03 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.530 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.531 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:03 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/568f2da4ce95a86d7a80980ca151026863237e4adacc75e86ab04e75f5fed0f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:35:03 np0005601977 podman[221361]: 2026-01-30 09:35:03.545574685 +0000 UTC m=+0.105861310 container init 2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:35:03 np0005601977 podman[221361]: 2026-01-30 09:35:03.550993411 +0000 UTC m=+0.111279896 container start 2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:35:03 np0005601977 podman[221361]: 2026-01-30 09:35:03.466080518 +0000 UTC m=+0.026367093 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.575 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:03 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [NOTICE]   (221391) : New worker (221397) forked
Jan 30 04:35:03 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [NOTICE]   (221391) : Loading success.
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.594 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765703.5946, 119ec1c9-9292-45e9-ae02-0750dc595ccc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.595 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] VM Started (Lifecycle Event)#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.598 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.613 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.618 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.619 183134 INFO nova.virt.libvirt.driver [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance spawned successfully.#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.619 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.622 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.645 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.645 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.646 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.646 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.647 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.647 183134 DEBUG nova.virt.libvirt.driver [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.651 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.651 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765703.5952375, 119ec1c9-9292-45e9-ae02-0750dc595ccc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.651 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.698 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.701 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765703.6043146, 119ec1c9-9292-45e9-ae02-0750dc595ccc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.701 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.721 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.724 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.729 183134 INFO nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Took 7.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.729 183134 DEBUG nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.745 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.746 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5632MB free_disk=73.24898910522461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.747 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.747 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.771 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.836 183134 INFO nova.compute.manager [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Took 8.28 seconds to build instance.#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.857 183134 DEBUG oslo_concurrency.lockutils [None req-fbb21069-7e1d-4d71-aa7b-a3909fe63edf 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.926 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 119ec1c9-9292-45e9-ae02-0750dc595ccc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.926 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.927 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.982 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:35:03 np0005601977 nova_compute[183130]: 2026-01-30 09:35:03.997 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:35:04 np0005601977 nova_compute[183130]: 2026-01-30 09:35:04.025 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:35:04 np0005601977 nova_compute[183130]: 2026-01-30 09:35:04.025 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:04 np0005601977 nova_compute[183130]: 2026-01-30 09:35:04.968 183134 DEBUG nova.network.neutron [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updated VIF entry in instance network info cache for port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:35:04 np0005601977 nova_compute[183130]: 2026-01-30 09:35:04.969 183134 DEBUG nova.network.neutron [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:04 np0005601977 nova_compute[183130]: 2026-01-30 09:35:04.995 183134 DEBUG oslo_concurrency.lockutils [req-f9d12513-54c6-42a0-bed8-69d0b7cb06d2 req-3495635d-7c73-49e5-881a-2a9d0fb7e070 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:05 np0005601977 nova_compute[183130]: 2026-01-30 09:35:05.844 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:06 np0005601977 nova_compute[183130]: 2026-01-30 09:35:06.110 183134 DEBUG nova.compute.manager [req-c641ba08-08f2-4e98-a8dd-0fca62578eef req-951bf47b-76c8-476c-aefb-d6059810b91f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:06 np0005601977 nova_compute[183130]: 2026-01-30 09:35:06.111 183134 DEBUG oslo_concurrency.lockutils [req-c641ba08-08f2-4e98-a8dd-0fca62578eef req-951bf47b-76c8-476c-aefb-d6059810b91f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:06 np0005601977 nova_compute[183130]: 2026-01-30 09:35:06.111 183134 DEBUG oslo_concurrency.lockutils [req-c641ba08-08f2-4e98-a8dd-0fca62578eef req-951bf47b-76c8-476c-aefb-d6059810b91f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:06 np0005601977 nova_compute[183130]: 2026-01-30 09:35:06.112 183134 DEBUG oslo_concurrency.lockutils [req-c641ba08-08f2-4e98-a8dd-0fca62578eef req-951bf47b-76c8-476c-aefb-d6059810b91f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:06 np0005601977 nova_compute[183130]: 2026-01-30 09:35:06.113 183134 DEBUG nova.compute.manager [req-c641ba08-08f2-4e98-a8dd-0fca62578eef req-951bf47b-76c8-476c-aefb-d6059810b91f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] No waiting events found dispatching network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:35:06 np0005601977 nova_compute[183130]: 2026-01-30 09:35:06.114 183134 WARNING nova.compute.manager [req-c641ba08-08f2-4e98-a8dd-0fca62578eef req-951bf47b-76c8-476c-aefb-d6059810b91f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received unexpected event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:35:07 np0005601977 nova_compute[183130]: 2026-01-30 09:35:07.067 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:09 np0005601977 NetworkManager[55565]: <info>  [1769765709.2486] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 30 04:35:09 np0005601977 NetworkManager[55565]: <info>  [1769765709.2495] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.248 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.272 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:09Z|00351|binding|INFO|Releasing lport 8090f739-5047-4897-b693-95848bd03441 from this chassis (sb_readonly=0)
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.290 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.979 183134 DEBUG nova.compute.manager [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-changed-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.979 183134 DEBUG nova.compute.manager [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Refreshing instance network info cache due to event network-changed-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.980 183134 DEBUG oslo_concurrency.lockutils [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.980 183134 DEBUG oslo_concurrency.lockutils [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:09 np0005601977 nova_compute[183130]: 2026-01-30 09:35:09.980 183134 DEBUG nova.network.neutron [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Refreshing network info cache for port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:35:10 np0005601977 nova_compute[183130]: 2026-01-30 09:35:10.025 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:10 np0005601977 nova_compute[183130]: 2026-01-30 09:35:10.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:10 np0005601977 nova_compute[183130]: 2026-01-30 09:35:10.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:10 np0005601977 nova_compute[183130]: 2026-01-30 09:35:10.845 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:11 np0005601977 nova_compute[183130]: 2026-01-30 09:35:11.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:11 np0005601977 nova_compute[183130]: 2026-01-30 09:35:11.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:11 np0005601977 nova_compute[183130]: 2026-01-30 09:35:11.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:35:11 np0005601977 podman[221408]: 2026-01-30 09:35:11.829342242 +0000 UTC m=+0.048884063 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:35:12 np0005601977 nova_compute[183130]: 2026-01-30 09:35:12.069 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:12 np0005601977 nova_compute[183130]: 2026-01-30 09:35:12.338 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:12 np0005601977 nova_compute[183130]: 2026-01-30 09:35:12.371 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:12 np0005601977 nova_compute[183130]: 2026-01-30 09:35:12.970 183134 DEBUG nova.network.neutron [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updated VIF entry in instance network info cache for port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:35:12 np0005601977 nova_compute[183130]: 2026-01-30 09:35:12.971 183134 DEBUG nova.network.neutron [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:12 np0005601977 nova_compute[183130]: 2026-01-30 09:35:12.989 183134 DEBUG oslo_concurrency.lockutils [req-26af3f14-f3c8-45e5-bf12-3cba2473f3d3 req-07102176-aed1-404a-bbaf-24a517475af0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:13 np0005601977 nova_compute[183130]: 2026-01-30 09:35:13.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:13 np0005601977 nova_compute[183130]: 2026-01-30 09:35:13.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:35:13 np0005601977 nova_compute[183130]: 2026-01-30 09:35:13.507 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:35:15 np0005601977 nova_compute[183130]: 2026-01-30 09:35:15.503 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:35:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:15Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:70:a6:fc 10.100.0.12
Jan 30 04:35:15 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:15Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:70:a6:fc 10.100.0.12
Jan 30 04:35:15 np0005601977 nova_compute[183130]: 2026-01-30 09:35:15.847 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:17 np0005601977 nova_compute[183130]: 2026-01-30 09:35:17.071 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:20 np0005601977 nova_compute[183130]: 2026-01-30 09:35:20.849 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:21 np0005601977 nova_compute[183130]: 2026-01-30 09:35:21.352 183134 INFO nova.compute.manager [None req-2998bc41-7cd1-45fb-9275-b302ed5581a8 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Get console output#033[00m
Jan 30 04:35:21 np0005601977 nova_compute[183130]: 2026-01-30 09:35:21.359 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:35:21 np0005601977 podman[221446]: 2026-01-30 09:35:21.844064828 +0000 UTC m=+0.057462952 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:35:21 np0005601977 podman[221445]: 2026-01-30 09:35:21.867016541 +0000 UTC m=+0.083495224 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1769056855, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 30 04:35:22 np0005601977 nova_compute[183130]: 2026-01-30 09:35:22.073 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:25 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:25Z|00352|binding|INFO|Releasing lport 8090f739-5047-4897-b693-95848bd03441 from this chassis (sb_readonly=0)
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.209 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.370 183134 DEBUG nova.compute.manager [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.491 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.491 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.567 183134 DEBUG nova.objects.instance [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_requests' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.591 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.591 183134 INFO nova.compute.claims [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.592 183134 DEBUG nova.objects.instance [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.621 183134 DEBUG nova.objects.instance [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.744 183134 INFO nova.compute.resource_tracker [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating resource usage from migration 674e6698-7cfd-4565-9c3b-7d7741025274#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.796 183134 DEBUG nova.compute.provider_tree [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.812 183134 DEBUG nova.scheduler.client.report [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.850 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.873 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:25 np0005601977 nova_compute[183130]: 2026-01-30 09:35:25.874 183134 INFO nova.compute.manager [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Migrating#033[00m
Jan 30 04:35:26 np0005601977 nova_compute[183130]: 2026-01-30 09:35:26.058 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:26 np0005601977 nova_compute[183130]: 2026-01-30 09:35:26.058 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:26 np0005601977 nova_compute[183130]: 2026-01-30 09:35:26.059 183134 DEBUG nova.network.neutron [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:35:27 np0005601977 nova_compute[183130]: 2026-01-30 09:35:27.073 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:28 np0005601977 nova_compute[183130]: 2026-01-30 09:35:28.257 183134 DEBUG nova.network.neutron [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:28 np0005601977 nova_compute[183130]: 2026-01-30 09:35:28.363 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:29 np0005601977 nova_compute[183130]: 2026-01-30 09:35:29.093 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 30 04:35:29 np0005601977 nova_compute[183130]: 2026-01-30 09:35:29.096 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 30 04:35:29 np0005601977 podman[221490]: 2026-01-30 09:35:29.851605695 +0000 UTC m=+0.064078583 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:35:29 np0005601977 podman[221491]: 2026-01-30 09:35:29.863292913 +0000 UTC m=+0.073041062 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:35:30 np0005601977 nova_compute[183130]: 2026-01-30 09:35:30.851 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:31 np0005601977 kernel: tapa6bc941a-f4 (unregistering): left promiscuous mode
Jan 30 04:35:31 np0005601977 NetworkManager[55565]: <info>  [1769765731.2559] device (tapa6bc941a-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:35:31 np0005601977 nova_compute[183130]: 2026-01-30 09:35:31.267 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:31Z|00353|binding|INFO|Releasing lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 from this chassis (sb_readonly=0)
Jan 30 04:35:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:31Z|00354|binding|INFO|Setting lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 down in Southbound
Jan 30 04:35:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:31Z|00355|binding|INFO|Removing iface tapa6bc941a-f4 ovn-installed in OVS
Jan 30 04:35:31 np0005601977 nova_compute[183130]: 2026-01-30 09:35:31.270 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:31 np0005601977 nova_compute[183130]: 2026-01-30 09:35:31.277 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:31 np0005601977 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 30 04:35:31 np0005601977 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000022.scope: Consumed 12.910s CPU time.
Jan 30 04:35:31 np0005601977 systemd-machined[154431]: Machine qemu-28-instance-00000022 terminated.
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.347 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:a6:fc 10.100.0.12'], port_security=['fa:16:3e:70:a6:fc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8954061-53dd-4ae5-b1f3-551811d4d932', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '10c9590a-9772-4136-9d98-7fa939664eb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eeedb5f-c60c-41d6-af1d-b2452d4a440d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.349 104706 INFO neutron.agent.ovn.metadata.agent [-] Port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 in datapath b8954061-53dd-4ae5-b1f3-551811d4d932 unbound from our chassis#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.351 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8954061-53dd-4ae5-b1f3-551811d4d932, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.352 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[98ae3d31-f496-4ece-9864-44383b3eec28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.353 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 namespace which is not needed anymore#033[00m
Jan 30 04:35:31 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [NOTICE]   (221391) : haproxy version is 2.8.14-c23fe91
Jan 30 04:35:31 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [NOTICE]   (221391) : path to executable is /usr/sbin/haproxy
Jan 30 04:35:31 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [WARNING]  (221391) : Exiting Master process...
Jan 30 04:35:31 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [WARNING]  (221391) : Exiting Master process...
Jan 30 04:35:31 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [ALERT]    (221391) : Current worker (221397) exited with code 143 (Terminated)
Jan 30 04:35:31 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221382]: [WARNING]  (221391) : All workers exited. Exiting... (0)
Jan 30 04:35:31 np0005601977 systemd[1]: libpod-2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20.scope: Deactivated successfully.
Jan 30 04:35:31 np0005601977 podman[221557]: 2026-01-30 09:35:31.478360713 +0000 UTC m=+0.048012208 container died 2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:35:31 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20-userdata-shm.mount: Deactivated successfully.
Jan 30 04:35:31 np0005601977 systemd[1]: var-lib-containers-storage-overlay-568f2da4ce95a86d7a80980ca151026863237e4adacc75e86ab04e75f5fed0f6-merged.mount: Deactivated successfully.
Jan 30 04:35:31 np0005601977 podman[221557]: 2026-01-30 09:35:31.5204542 +0000 UTC m=+0.090105675 container cleanup 2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:35:31 np0005601977 systemd[1]: libpod-conmon-2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20.scope: Deactivated successfully.
Jan 30 04:35:31 np0005601977 podman[221602]: 2026-01-30 09:35:31.572605817 +0000 UTC m=+0.037551346 container remove 2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.576 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[86d649e4-4c8b-4f86-97ed-3b9d89e5c56b]: (4, ('Fri Jan 30 09:35:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 (2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20)\n2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20\nFri Jan 30 09:35:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 (2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20)\n2beb12e898a4350e87b0114bc0abcbb366ce5d0c32f41c2a1da2d4920f13cd20\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.578 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d06bbe58-6e64-41c8-8da2-d81c3d3aa5f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.579 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8954061-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:31 np0005601977 nova_compute[183130]: 2026-01-30 09:35:31.581 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:31 np0005601977 kernel: tapb8954061-50: left promiscuous mode
Jan 30 04:35:31 np0005601977 nova_compute[183130]: 2026-01-30 09:35:31.588 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.593 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3afc46ed-4617-4b0d-979d-1b7eb8e2e93b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.606 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2aed22-9006-42c0-8a1a-0e810826f5e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.607 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9a921dd7-b332-4a58-b503-d31e1ab6b998]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.627 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[54bc6abd-5311-496e-ada2-2a8078365a5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423426, 'reachable_time': 42175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221621, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.630 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:35:31 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:31.630 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[652d6e43-804f-4ac5-b004-f338050916e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:31 np0005601977 systemd[1]: run-netns-ovnmeta\x2db8954061\x2d53dd\x2d4ae5\x2db1f3\x2d551811d4d932.mount: Deactivated successfully.
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.075 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.110 183134 INFO nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance shutdown successfully after 3 seconds.#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.117 183134 INFO nova.virt.libvirt.driver [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance destroyed successfully.#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.118 183134 DEBUG nova.virt.libvirt.vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1374074683',display_name='tempest-TestNetworkAdvancedServerOps-server-1374074683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1374074683',id=34,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdPptm/HnzkqTImZqDAl7EyO2iWhxnsGpAr0rtNjW8XlU6U9DL+LFW5w4FNGYvckrrl1CqBUZRMRpzopHQZlljyrRKm4azU2AW9n9Zcwea07xun0VFkuuj0zR3/K+H88A==',key_name='tempest-TestNetworkAdvancedServerOps-15479994',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:35:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-svapo3ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:35:25Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=119ec1c9-9292-45e9-ae02-0750dc595ccc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.119 183134 DEBUG nova.network.os_vif_util [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.120 183134 DEBUG nova.network.os_vif_util [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.121 183134 DEBUG os_vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.124 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.125 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6bc941a-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.127 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.130 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.133 183134 INFO os_vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4')#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.137 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.157 183134 DEBUG nova.compute.manager [req-3598aa06-387d-41a2-b94e-7c4d028c278d req-abcb3f05-e9fa-4463-82dd-0ab94a6915a6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-unplugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.158 183134 DEBUG oslo_concurrency.lockutils [req-3598aa06-387d-41a2-b94e-7c4d028c278d req-abcb3f05-e9fa-4463-82dd-0ab94a6915a6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.158 183134 DEBUG oslo_concurrency.lockutils [req-3598aa06-387d-41a2-b94e-7c4d028c278d req-abcb3f05-e9fa-4463-82dd-0ab94a6915a6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.159 183134 DEBUG oslo_concurrency.lockutils [req-3598aa06-387d-41a2-b94e-7c4d028c278d req-abcb3f05-e9fa-4463-82dd-0ab94a6915a6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.159 183134 DEBUG nova.compute.manager [req-3598aa06-387d-41a2-b94e-7c4d028c278d req-abcb3f05-e9fa-4463-82dd-0ab94a6915a6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] No waiting events found dispatching network-vif-unplugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.160 183134 WARNING nova.compute.manager [req-3598aa06-387d-41a2-b94e-7c4d028c278d req-abcb3f05-e9fa-4463-82dd-0ab94a6915a6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received unexpected event network-vif-unplugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.207 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.208 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.284 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.286 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_resize/disk /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.304 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "cp -r /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_resize/disk /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.304 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_resize/disk.config /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.317 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "cp -r /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_resize/disk.config /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.318 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_resize/disk.info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.331 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "cp -r /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_resize/disk.info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:32 np0005601977 nova_compute[183130]: 2026-01-30 09:35:32.981 183134 DEBUG nova.network.neutron [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 30 04:35:33 np0005601977 nova_compute[183130]: 2026-01-30 09:35:33.505 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:33 np0005601977 nova_compute[183130]: 2026-01-30 09:35:33.506 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:33 np0005601977 nova_compute[183130]: 2026-01-30 09:35:33.507 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:33 np0005601977 podman[221631]: 2026-01-30 09:35:33.856054621 +0000 UTC m=+0.076767419 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.066 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.066 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.067 183134 DEBUG nova.network.neutron [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.272 183134 DEBUG nova.compute.manager [req-1fce41c0-fa0a-459d-895c-ebfffa5128db req-08e88dba-a849-4b5e-b7ee-1e0ba6abd20d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.272 183134 DEBUG oslo_concurrency.lockutils [req-1fce41c0-fa0a-459d-895c-ebfffa5128db req-08e88dba-a849-4b5e-b7ee-1e0ba6abd20d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.273 183134 DEBUG oslo_concurrency.lockutils [req-1fce41c0-fa0a-459d-895c-ebfffa5128db req-08e88dba-a849-4b5e-b7ee-1e0ba6abd20d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.273 183134 DEBUG oslo_concurrency.lockutils [req-1fce41c0-fa0a-459d-895c-ebfffa5128db req-08e88dba-a849-4b5e-b7ee-1e0ba6abd20d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.273 183134 DEBUG nova.compute.manager [req-1fce41c0-fa0a-459d-895c-ebfffa5128db req-08e88dba-a849-4b5e-b7ee-1e0ba6abd20d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] No waiting events found dispatching network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:35:34 np0005601977 nova_compute[183130]: 2026-01-30 09:35:34.273 183134 WARNING nova.compute.manager [req-1fce41c0-fa0a-459d-895c-ebfffa5128db req-08e88dba-a849-4b5e-b7ee-1e0ba6abd20d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received unexpected event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 30 04:35:35 np0005601977 nova_compute[183130]: 2026-01-30 09:35:35.853 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:35.999 183134 DEBUG nova.network.neutron [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.124 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.358 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.359 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.360 183134 INFO nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Creating image(s)#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.361 183134 DEBUG nova.objects.instance [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.385 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.448 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.449 183134 DEBUG nova.virt.disk.api [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.449 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.510 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.511 183134 DEBUG nova.virt.disk.api [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.591 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.591 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Ensure instance console log exists: /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.592 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.592 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.593 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.595 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Start _get_guest_xml network_info=[{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.599 183134 WARNING nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.603 183134 DEBUG nova.virt.libvirt.host [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.604 183134 DEBUG nova.virt.libvirt.host [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.608 183134 DEBUG nova.virt.libvirt.host [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.608 183134 DEBUG nova.virt.libvirt.host [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.609 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.609 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.610 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.610 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.610 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.610 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.611 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.611 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.611 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.611 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.612 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.612 183134 DEBUG nova.virt.hardware [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.612 183134 DEBUG nova.objects.instance [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.829 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.890 183134 DEBUG oslo_concurrency.processutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.890 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.891 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.891 183134 DEBUG oslo_concurrency.lockutils [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.892 183134 DEBUG nova.virt.libvirt.vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1374074683',display_name='tempest-TestNetworkAdvancedServerOps-server-1374074683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1374074683',id=34,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdPptm/HnzkqTImZqDAl7EyO2iWhxnsGpAr0rtNjW8XlU6U9DL+LFW5w4FNGYvckrrl1CqBUZRMRpzopHQZlljyrRKm4azU2AW9n9Zcwea07xun0VFkuuj0zR3/K+H88A==',key_name='tempest-TestNetworkAdvancedServerOps-15479994',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:35:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-svapo3ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:35:33Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=119ec1c9-9292-45e9-ae02-0750dc595ccc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.893 183134 DEBUG nova.network.os_vif_util [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.893 183134 DEBUG nova.network.os_vif_util [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.895 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <uuid>119ec1c9-9292-45e9-ae02-0750dc595ccc</uuid>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <name>instance-00000022</name>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <memory>196608</memory>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1374074683</nova:name>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:35:36</nova:creationTime>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.micro">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:memory>192</nova:memory>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        <nova:port uuid="a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <entry name="serial">119ec1c9-9292-45e9-ae02-0750dc595ccc</entry>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <entry name="uuid">119ec1c9-9292-45e9-ae02-0750dc595ccc</entry>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.config"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:70:a6:fc"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <target dev="tapa6bc941a-f4"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc/console.log" append="off"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:35:36 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:35:36 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:35:36 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:35:36 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.896 183134 DEBUG nova.virt.libvirt.vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1374074683',display_name='tempest-TestNetworkAdvancedServerOps-server-1374074683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1374074683',id=34,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdPptm/HnzkqTImZqDAl7EyO2iWhxnsGpAr0rtNjW8XlU6U9DL+LFW5w4FNGYvckrrl1CqBUZRMRpzopHQZlljyrRKm4azU2AW9n9Zcwea07xun0VFkuuj0zR3/K+H88A==',key_name='tempest-TestNetworkAdvancedServerOps-15479994',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:35:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-svapo3ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:35:33Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=119ec1c9-9292-45e9-ae02-0750dc595ccc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.896 183134 DEBUG nova.network.os_vif_util [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1522265598", "vif_mac": "fa:16:3e:70:a6:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.897 183134 DEBUG nova.network.os_vif_util [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.897 183134 DEBUG os_vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.898 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.898 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.898 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.900 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.900 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6bc941a-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.901 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6bc941a-f4, col_values=(('external_ids', {'iface-id': 'a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:a6:fc', 'vm-uuid': '119ec1c9-9292-45e9-ae02-0750dc595ccc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.902 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:36 np0005601977 NetworkManager[55565]: <info>  [1769765736.9036] manager: (tapa6bc941a-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.904 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.907 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:36 np0005601977 nova_compute[183130]: 2026-01-30 09:35:36.908 183134 INFO os_vif [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4')#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.185 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.186 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.186 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:70:a6:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.188 183134 INFO nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Using config drive#033[00m
Jan 30 04:35:37 np0005601977 kernel: tapa6bc941a-f4: entered promiscuous mode
Jan 30 04:35:37 np0005601977 NetworkManager[55565]: <info>  [1769765737.2378] manager: (tapa6bc941a-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Jan 30 04:35:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:37Z|00356|binding|INFO|Claiming lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for this chassis.
Jan 30 04:35:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:37Z|00357|binding|INFO|a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17: Claiming fa:16:3e:70:a6:fc 10.100.0.12
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.238 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.247 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:37Z|00358|binding|INFO|Setting lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 ovn-installed in OVS
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.250 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.254 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 systemd-machined[154431]: New machine qemu-29-instance-00000022.
Jan 30 04:35:37 np0005601977 systemd[1]: Started Virtual Machine qemu-29-instance-00000022.
Jan 30 04:35:37 np0005601977 systemd-udevd[221683]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:35:37 np0005601977 NetworkManager[55565]: <info>  [1769765737.3157] device (tapa6bc941a-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:35:37 np0005601977 NetworkManager[55565]: <info>  [1769765737.3162] device (tapa6bc941a-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.590 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:a6:fc 10.100.0.12'], port_security=['fa:16:3e:70:a6:fc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8954061-53dd-4ae5-b1f3-551811d4d932', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '10c9590a-9772-4136-9d98-7fa939664eb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eeedb5f-c60c-41d6-af1d-b2452d4a440d, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:37Z|00359|binding|INFO|Setting lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 up in Southbound
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.594 104706 INFO neutron.agent.ovn.metadata.agent [-] Port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 in datapath b8954061-53dd-4ae5-b1f3-551811d4d932 bound to our chassis#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.596 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8954061-53dd-4ae5-b1f3-551811d4d932#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.608 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3e273d9b-69b3-4cec-8b71-adaecd4acc68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.610 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8954061-51 in ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.612 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8954061-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.612 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[89c73dca-a71d-4dd5-a795-41ff507548fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.613 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9c304b-624e-44c7-a935-7e020a7acf5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.624 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[106942a5-f9e1-41cf-bbd9-1720c7503825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.639 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cf93153b-cd7f-4f2f-9162-d6881eb89c2b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.661 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[15d9b1f5-1254-4ba2-99fa-88450c823e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.665 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe8ca50-9c55-483b-94a2-28bed17000ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 NetworkManager[55565]: <info>  [1769765737.6666] manager: (tapb8954061-50): new Veth device (/org/freedesktop/NetworkManager/Devices/141)
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.698 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e33a17-dd3a-4cf2-823e-99dd0efa65af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.703 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e686ff5b-b0f5-408c-91e0-bd921f2eb24e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 NetworkManager[55565]: <info>  [1769765737.7237] device (tapb8954061-50): carrier: link connected
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.728 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[fac5b3c6-79a1-40d1-a582-6b2b07adde8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.744 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4c20345e-056d-4e2d-8342-779406bd1e00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8954061-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b3:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426906, 'reachable_time': 43827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221723, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.758 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Removed pending event for 119ec1c9-9292-45e9-ae02-0750dc595ccc due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.759 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765737.7577977, 119ec1c9-9292-45e9-ae02-0750dc595ccc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.759 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.761 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad2d2fa-63cd-4d28-9b2d-4d2b42589a36]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:b365'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426906, 'tstamp': 426906}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221724, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.762 183134 DEBUG nova.compute.manager [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.766 183134 INFO nova.virt.libvirt.driver [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance running successfully.#033[00m
Jan 30 04:35:37 np0005601977 virtqemud[182587]: argument unsupported: QEMU guest agent is not configured
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.769 183134 DEBUG nova.virt.libvirt.guest [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.769 183134 DEBUG nova.virt.libvirt.driver [None req-95de0fa5-a73b-4157-af28-becd6229e611 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.772 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8470f618-9391-4856-b1f8-ae87d8234265]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8954061-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b3:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426906, 'reachable_time': 43827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221725, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.794 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c31af31a-55ef-4535-a1b5-6f7bcb20f2f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.846 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3a484cc7-7a44-4880-af84-dfbb9548d14b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.851 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8954061-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.852 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.852 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8954061-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.897 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 kernel: tapb8954061-50: entered promiscuous mode
Jan 30 04:35:37 np0005601977 NetworkManager[55565]: <info>  [1769765737.8978] manager: (tapb8954061-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.900 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8954061-50, col_values=(('external_ids', {'iface-id': '8090f739-5047-4897-b693-95848bd03441'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.901 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:37Z|00360|binding|INFO|Releasing lport 8090f739-5047-4897-b693-95848bd03441 from this chassis (sb_readonly=0)
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.905 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.906 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8954061-53dd-4ae5-b1f3-551811d4d932.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8954061-53dd-4ae5-b1f3-551811d4d932.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.907 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1659ee-7e3c-4999-a377-134e2e548dca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.908 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b8954061-53dd-4ae5-b1f3-551811d4d932
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b8954061-53dd-4ae5-b1f3-551811d4d932.pid.haproxy
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b8954061-53dd-4ae5-b1f3-551811d4d932
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:35:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:37.910 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'env', 'PROCESS_TAG=haproxy-b8954061-53dd-4ae5-b1f3-551811d4d932', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8954061-53dd-4ae5-b1f3-551811d4d932.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.970 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:35:37 np0005601977 nova_compute[183130]: 2026-01-30 09:35:37.973 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:35:38 np0005601977 podman[221757]: 2026-01-30 09:35:38.245605607 +0000 UTC m=+0.049672866 container create ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:35:38 np0005601977 systemd[1]: Started libpod-conmon-ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c.scope.
Jan 30 04:35:38 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:35:38 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c568ccea9f2104372bada29075306d1170bc8a70de9485aff7b5f03bd712131/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.317 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 30 04:35:38 np0005601977 podman[221757]: 2026-01-30 09:35:38.222803098 +0000 UTC m=+0.026870337 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.318 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765737.7616875, 119ec1c9-9292-45e9-ae02-0750dc595ccc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.318 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] VM Started (Lifecycle Event)#033[00m
Jan 30 04:35:38 np0005601977 podman[221757]: 2026-01-30 09:35:38.320522662 +0000 UTC m=+0.124589901 container init ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:35:38 np0005601977 podman[221757]: 2026-01-30 09:35:38.327500354 +0000 UTC m=+0.131567583 container start ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:35:38 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [NOTICE]   (221777) : New worker (221779) forked
Jan 30 04:35:38 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [NOTICE]   (221777) : Loading success.
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.546 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.549 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.901 183134 DEBUG nova.compute.manager [req-c8cbe38d-28e1-4735-99e3-10aae6f336a1 req-51139ec8-1f0f-4eaf-8b07-9979bf513417 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.902 183134 DEBUG oslo_concurrency.lockutils [req-c8cbe38d-28e1-4735-99e3-10aae6f336a1 req-51139ec8-1f0f-4eaf-8b07-9979bf513417 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.902 183134 DEBUG oslo_concurrency.lockutils [req-c8cbe38d-28e1-4735-99e3-10aae6f336a1 req-51139ec8-1f0f-4eaf-8b07-9979bf513417 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.903 183134 DEBUG oslo_concurrency.lockutils [req-c8cbe38d-28e1-4735-99e3-10aae6f336a1 req-51139ec8-1f0f-4eaf-8b07-9979bf513417 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.903 183134 DEBUG nova.compute.manager [req-c8cbe38d-28e1-4735-99e3-10aae6f336a1 req-51139ec8-1f0f-4eaf-8b07-9979bf513417 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] No waiting events found dispatching network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:35:38 np0005601977 nova_compute[183130]: 2026-01-30 09:35:38.903 183134 WARNING nova.compute.manager [req-c8cbe38d-28e1-4735-99e3-10aae6f336a1 req-51139ec8-1f0f-4eaf-8b07-9979bf513417 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received unexpected event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for instance with vm_state resized and task_state None.#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.410 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.411 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.411 183134 DEBUG nova.compute.manager [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Going to confirm migration 6 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.705 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.705 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.706 183134 DEBUG nova.network.neutron [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.706 183134 DEBUG nova.objects.instance [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'info_cache' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:40 np0005601977 nova_compute[183130]: 2026-01-30 09:35:40.855 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.111 183134 DEBUG nova.compute.manager [req-7592df95-28b6-4670-a841-3a312ef9d08e req-5460af01-4610-43ff-b950-f838e638cfc7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.111 183134 DEBUG oslo_concurrency.lockutils [req-7592df95-28b6-4670-a841-3a312ef9d08e req-5460af01-4610-43ff-b950-f838e638cfc7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.112 183134 DEBUG oslo_concurrency.lockutils [req-7592df95-28b6-4670-a841-3a312ef9d08e req-5460af01-4610-43ff-b950-f838e638cfc7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.112 183134 DEBUG oslo_concurrency.lockutils [req-7592df95-28b6-4670-a841-3a312ef9d08e req-5460af01-4610-43ff-b950-f838e638cfc7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.112 183134 DEBUG nova.compute.manager [req-7592df95-28b6-4670-a841-3a312ef9d08e req-5460af01-4610-43ff-b950-f838e638cfc7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] No waiting events found dispatching network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.112 183134 WARNING nova.compute.manager [req-7592df95-28b6-4670-a841-3a312ef9d08e req-5460af01-4610-43ff-b950-f838e638cfc7 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received unexpected event network-vif-plugged-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 for instance with vm_state resized and task_state None.#033[00m
Jan 30 04:35:41 np0005601977 nova_compute[183130]: 2026-01-30 09:35:41.904 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:42 np0005601977 nova_compute[183130]: 2026-01-30 09:35:42.758 183134 DEBUG nova.network.neutron [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:42 np0005601977 nova_compute[183130]: 2026-01-30 09:35:42.811 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:42 np0005601977 nova_compute[183130]: 2026-01-30 09:35:42.812 183134 DEBUG nova.objects.instance [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:42 np0005601977 podman[221788]: 2026-01-30 09:35:42.836030768 +0000 UTC m=+0.055472384 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:35:42 np0005601977 nova_compute[183130]: 2026-01-30 09:35:42.847 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:42 np0005601977 nova_compute[183130]: 2026-01-30 09:35:42.847 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:42 np0005601977 nova_compute[183130]: 2026-01-30 09:35:42.970 183134 DEBUG nova.compute.provider_tree [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:35:43 np0005601977 nova_compute[183130]: 2026-01-30 09:35:43.008 183134 DEBUG nova.scheduler.client.report [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:35:43 np0005601977 nova_compute[183130]: 2026-01-30 09:35:43.119 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:43 np0005601977 nova_compute[183130]: 2026-01-30 09:35:43.315 183134 INFO nova.scheduler.client.report [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocation for migration 674e6698-7cfd-4565-9c3b-7d7741025274#033[00m
Jan 30 04:35:43 np0005601977 nova_compute[183130]: 2026-01-30 09:35:43.385 183134 DEBUG oslo_concurrency.lockutils [None req-ad23ea78-b5fe-4ef6-8f0f-8bdf735123e4 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.974s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:43 np0005601977 nova_compute[183130]: 2026-01-30 09:35:43.827 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:45 np0005601977 nova_compute[183130]: 2026-01-30 09:35:45.858 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:46 np0005601977 nova_compute[183130]: 2026-01-30 09:35:46.922 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:47 np0005601977 nova_compute[183130]: 2026-01-30 09:35:47.941 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:47.941 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:47.943 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:35:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:47.944 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:48Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:70:a6:fc 10.100.0.12
Jan 30 04:35:49 np0005601977 nova_compute[183130]: 2026-01-30 09:35:49.420 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:50 np0005601977 nova_compute[183130]: 2026-01-30 09:35:50.860 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:51 np0005601977 nova_compute[183130]: 2026-01-30 09:35:51.925 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:52 np0005601977 podman[221822]: 2026-01-30 09:35:52.85937314 +0000 UTC m=+0.078150452 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:35:52 np0005601977 podman[221821]: 2026-01-30 09:35:52.876069203 +0000 UTC m=+0.096220655 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.452 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b741677902cdc34801a38b7658be191fd284b6abc977e6eabd1767165f54f9cc" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 30 04:35:55 np0005601977 nova_compute[183130]: 2026-01-30 09:35:55.484 183134 INFO nova.compute.manager [None req-c3f98436-3fd3-4420-a275-ff4aed305cba 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Get console output#033[00m
Jan 30 04:35:55 np0005601977 nova_compute[183130]: 2026-01-30 09:35:55.490 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.626 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 30 Jan 2026 09:35:55 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2d8d280d-49e4-433f-83bf-b52896fb7292 x-openstack-request-id: req-2d8d280d-49e4-433f-83bf-b52896fb7292 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.626 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "43faf4bc-65eb-437f-b3dc-707ebe898840", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/43faf4bc-65eb-437f-b3dc-707ebe898840"}]}, {"id": "bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.626 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-2d8d280d-49e4-433f-83bf-b52896fb7292 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.627 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b741677902cdc34801a38b7658be191fd284b6abc977e6eabd1767165f54f9cc" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.687 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Fri, 30 Jan 2026 09:35:55 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e613ebcf-b3b0-4ffc-a18e-841b1a45b060 x-openstack-request-id: req-e613ebcf-b3b0-4ffc-a18e-841b1a45b060 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.687 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.688 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8 used request id req-e613ebcf-b3b0-4ffc-a18e-841b1a45b060 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.689 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000022', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'hostId': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.695 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 119ec1c9-9292-45e9-ae02-0750dc595ccc / tapa6bc941a-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.695 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e24ca41-929c-4f38-b238-a5ce64ed0341', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.690499', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '138b4eb8-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': '8171b182f768af2ff410f525097a4db4c49378a5843f248156d9a63266a0e530'}]}, 'timestamp': '2026-01-30 09:35:55.696613', '_unique_id': 'a8ca9c776c91421187c363eae35509aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.698 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.699 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.699 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5116001-d13f-4df1-88d7-cef17729b96d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.699846', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '138be792-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': '68625084d934181588c7856c5bea73c432a29c15ff9e162be27ea7326ae39aa6'}]}, 'timestamp': '2026-01-30 09:35:55.700548', '_unique_id': '0429a43ab5594cc7a114a18ebdbb946f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.701 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.703 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d76cc39-dbb5-4b5f-b2fd-16262eb5f292', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.703379', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '138c7112-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': 'd7e8af3377468fc3903818b65fc0464a2958719a9a7f66019fa2f5947ea69a79'}]}, 'timestamp': '2026-01-30 09:35:55.704070', '_unique_id': '79c32a7823c44dbe9ceddb184b06916e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.705 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.737 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.read.requests volume: 1216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.738 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88b43a34-8522-41fa-9c6c-8ec22c482d1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1216, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.707309', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1391a06a-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': '9b204ab2ed5b8b1db9071219dfab7fca7914344957e0e652dd7ae57882f27f7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.707309', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1391bf46-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': '311f70c92fb061f5e79bb370887b236fd708e8d87079e2a73854477cf06d0572'}]}, 'timestamp': '2026-01-30 09:35:55.738780', '_unique_id': '614aa2e871e14ce79597fd19d2328212'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.740 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.741 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.outgoing.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51959fdb-c361-41c6-95a6-9a9d03dff4b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.741930', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '139250f0-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': 'f2b802047a25cdf55cc29a66a1071ea9155a543e8a4cea2092ab99841c48fcd0'}]}, 'timestamp': '2026-01-30 09:35:55.742505', '_unique_id': '270b3d92877345f290f63e451f1c36f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.743 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.745 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.read.bytes volume: 32061440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.745 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42ae37b4-c942-4d72-80f0-7dbfc637b1ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32061440, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.745029', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1392ccb0-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'a694b000cbfc603347e8113192e34d6e5c0769244680c1ae1f3811b21891ff82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': 
None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.745029', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1392e100-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': '42829c4973732f7497359248b8dc9d7a6ce817d4f174e26b81e38347700b9ae0'}]}, 'timestamp': '2026-01-30 09:35:55.746150', '_unique_id': '9e5fe9c7f2324c088b58f72305b0ffdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.749 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.749 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84d0c7aa-fddb-4965-92c4-b84881abc159', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.749026', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1393647c-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'adf3bbff766d7d911bdef8e85aeeaf472f0dc477c6ef1ad21b426093a378337f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 
'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.749026', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1393714c-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'f0fd7bf7afd9451dd5afd08361b1b9dcbe0a856c23b393a59865a2e6f334a3a3'}]}, 'timestamp': '2026-01-30 09:35:55.749782', '_unique_id': 'f3a17ed22f1348eb8e1ad8f87562d256'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.750 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.762 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.763 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '136e821a-c580-4622-a4bd-995c201e9570', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.751779', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '13958a9a-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.146715849, 'message_signature': '4cab0a225bfb30ac08817b55572cea297ac8643848e7bc82827e0d5f33126d8e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.751779', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '13959e36-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.146715849, 'message_signature': '1d6a617c72daffb57cd45d20d8bcba86e3b22eaffa08318f7d0a68f37bbc21d8'}]}, 'timestamp': '2026-01-30 09:35:55.764086', '_unique_id': 'cc75a7adc6ea45cf992999c4171c7454'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.765 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.766 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.766 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>]
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.767 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.outgoing.bytes volume: 2898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e206f611-8665-4acf-8971-a7d3ec4d63f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2898, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.767280', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '13962a86-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': '171eacc65fd90ee4c1f5ce034a5c4cd5da7e7ebdc36d4795136f4c0f90a4fcfe'}]}, 'timestamp': '2026-01-30 09:35:55.767652', '_unique_id': 'f0ac702d806245599d67c60a065fad33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.768 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.769 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.769 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>]
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.769 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.write.latency volume: 45322366 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.770 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19907e46-84c4-4094-9e96-e223d96b2b68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45322366, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.769958', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '13969250-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'b5fe5239769c14735fbb3f6ef9c7c555a68b592a2b2748034eafbdeba8f520b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.769958', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '13969f16-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'bbfe6f8972e617468c7fc7e9a5b207e79ca387dc71a429f13ad6a93fcb2ca6ea'}]}, 'timestamp': '2026-01-30 09:35:55.770590', '_unique_id': 'b530f1d9b1bb468db095d7523938fd89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.771 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.772 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.787 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/memory.usage volume: 41.5546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33e820e9-8676-44f2-b82c-8bf46233b128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.5546875, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'timestamp': '2026-01-30T09:35:55.772379', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1399450e-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.182212375, 'message_signature': '823ce2d67047f98f6f1473c7f02b564a72049bf87961bb3a7731d3afd8cfd1a1'}]}, 'timestamp': '2026-01-30 09:35:55.787972', '_unique_id': 'cc3e7c51ce484fb594470d08ffd12b38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.788 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.789 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.789 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/cpu volume: 10780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20d766f2-3288-48e9-888f-ff7f36add602', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10780000000, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'timestamp': '2026-01-30T09:35:55.789588', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '13998fb4-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.182212375, 'message_signature': 'c627330d9feed4da0512af3c8a2b67974a29f31c46cbdda5e60017c78fe4edd3'}]}, 'timestamp': '2026-01-30 09:35:55.789818', '_unique_id': 'eff26e8006574aabb93da3bd247bb16b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.790 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d883fd9-8832-4d5e-bff5-71e55db971b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.790857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '1399c0ec-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': '85d457f7d6f312376f67a59de61c36679525e2ad369e789070030a246023fc40'}]}, 'timestamp': '2026-01-30 09:35:55.791083', '_unique_id': 'dea44e3c6530418bbc54e4a3ddb595ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.791 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.792 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.read.latency volume: 1570640251 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.792 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.read.latency volume: 35787667 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df24178e-556d-4d5d-96aa-69f6179dafbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1570640251, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.792084', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1399f27e-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'fe7d87f716b6d6f760d36a9d2773c323b5b8e9ba1db4260baab8148c8ad76f2c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35787667, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.792084', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1399fa8a-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': '3a69d55c0ffc9d002d9ee7a7eff4cc1df4e5c7ecdc05ee15b78df88bb3304da9'}]}, 'timestamp': '2026-01-30 09:35:55.792544', '_unique_id': 'e218fe96c4d5475499340ae87b32e288'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.793 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d46a452-2f29-4a20-b2bf-5c6237b61473', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.793752', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '139a31c6-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.146715849, 'message_signature': 'd8e332633ffe7f0303a0dc4f8ec68bea8ccdde1054f8cdac992a736c423c4017'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.793752', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '139a3982-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.146715849, 'message_signature': '4306ab16d075cbf23faeb2dc361864aa56bd6bb5690e282e13c363ad62b3e943'}]}, 'timestamp': '2026-01-30 09:35:55.794157', '_unique_id': 'd9678ca736f74cc6982482af4d65fc2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.794 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.incoming.bytes volume: 3990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ccc57c0-9289-4ed9-b36d-0643ecf38424', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3990, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.795160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '139a698e-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': 'ce422cc4b81ff6850376eda1ed67cc5e6e6cec9714937d9dc6e101f12061dc3c'}]}, 'timestamp': '2026-01-30 09:35:55.795399', '_unique_id': '74af1bd30c2544dd99023524fa89abd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.795 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.796 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd83a2127-623c-4330-afb5-67dadb2efbe9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.796401', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '139a9918-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': '3c33836105b7b26eac22c1eb9f9b3ce170f8e6f3688dc7e56f37bf89d1da6b9d'}]}, 'timestamp': '2026-01-30 09:35:55.796615', '_unique_id': 'df7e4f846565448a9355cac9dcd13b77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>]
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.797 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84161ff1-d751-427f-b830-5666ba71d088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.797898', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '139ad3ba-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': '48206580f18e45f2749e7735f150558a25d1507db0b71330c55043372fcdb408'}]}, 'timestamp': '2026-01-30 09:35:55.798117', '_unique_id': 'e4de3c5f71eb442c82a188340a7df5de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.798 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.write.bytes volume: 274432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b43c7000-c075-4779-bb95-1d7ea6a46e91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274432, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.799092', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '139b02ea-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': 'ff9d8dd37b7c4b6410506e7ab368ace92ec6a9dc454e43e1104b0056b96e120a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.799092', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '139b0a88-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.102242972, 'message_signature': '2f617c98b4fcb014d8173ae0804e83256ecbec2e63556ff3904a74c921183ea8'}]}, 'timestamp': '2026-01-30 09:35:55.799506', '_unique_id': '0592d7c1df1e467495232f0ec8c69141'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.799 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.800 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.allocation volume: 27668480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.800 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e975b291-10e1-4aab-932d-ad3bbb25c08f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 27668480, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-vda', 'timestamp': '2026-01-30T09:35:55.800516', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '139b39c2-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.146715849, 'message_signature': 'e9e5221d5d6109ca9433a64be985c5e1287d77040c5d617246aacaf3ac456b93'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc-sda', 'timestamp': '2026-01-30T09:35:55.800516', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'instance-00000022', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '139b430e-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.146715849, 'message_signature': '3b63b1ba15be08987114af145ccd91b9b1aa7b47e80e97e0f9d29615dab2d3c1'}]}, 'timestamp': '2026-01-30 09:35:55.800955', '_unique_id': 'c1a73e7a5fdc4deeb41fa19056546142'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.801 12 DEBUG ceilometer.compute.pollsters [-] 119ec1c9-9292-45e9-ae02-0750dc595ccc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c15e4f9-507c-4601-8b47-a3bc96890576', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_name': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_name': None, 'resource_id': 'instance-00000022-119ec1c9-9292-45e9-ae02-0750dc595ccc-tapa6bc941a-f4', 'timestamp': '2026-01-30T09:35:55.801935', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1374074683', 'name': 'tapa6bc941a-f4', 'instance_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'instance_type': 'm1.micro', 'host': '1e1ccde85b3d5ad214761670dd39a5c73be4c488ad7f4dfd4eb981e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'bbf4a612-8ed5-4e5e-9ef3-9da4690b3bc8', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:a6:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6bc941a-f4'}, 'message_id': '139b7158-fdbf-11f0-a471-fa163eabe782', 'monotonic_time': 4287.085409685, 'message_signature': 'dfdd72c1255f8b35642987d88d832d1f7e33e45b4507b938e9592d80bdee82aa'}]}, 'timestamp': '2026-01-30 09:35:55.802149', '_unique_id': '3600825d6a6d4260bf6ca3dcd54d25fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.802 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.803 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.803 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:35:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:35:55.803 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1374074683>]
Jan 30 04:35:55 np0005601977 nova_compute[183130]: 2026-01-30 09:35:55.863 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:55 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.165 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.840 183134 DEBUG nova.compute.manager [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-changed-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.840 183134 DEBUG nova.compute.manager [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Refreshing instance network info cache due to event network-changed-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.840 183134 DEBUG oslo_concurrency.lockutils [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.840 183134 DEBUG oslo_concurrency.lockutils [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.841 183134 DEBUG nova.network.neutron [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Refreshing network info cache for port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:35:56 np0005601977 nova_compute[183130]: 2026-01-30 09:35:56.963 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.146 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.147 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.148 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.150 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.150 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.154 183134 INFO nova.compute.manager [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Terminating instance#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.156 183134 DEBUG nova.compute.manager [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:35:57 np0005601977 kernel: tapa6bc941a-f4 (unregistering): left promiscuous mode
Jan 30 04:35:57 np0005601977 NetworkManager[55565]: <info>  [1769765757.1839] device (tapa6bc941a-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:35:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:57Z|00361|binding|INFO|Releasing lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 from this chassis (sb_readonly=0)
Jan 30 04:35:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:57Z|00362|binding|INFO|Setting lport a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 down in Southbound
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.192 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:35:57Z|00363|binding|INFO|Removing iface tapa6bc941a-f4 ovn-installed in OVS
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.196 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.201 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:a6:fc 10.100.0.12'], port_security=['fa:16:3e:70:a6:fc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '119ec1c9-9292-45e9-ae02-0750dc595ccc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8954061-53dd-4ae5-b1f3-551811d4d932', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '10c9590a-9772-4136-9d98-7fa939664eb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eeedb5f-c60c-41d6-af1d-b2452d4a440d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.202 104706 INFO neutron.agent.ovn.metadata.agent [-] Port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 in datapath b8954061-53dd-4ae5-b1f3-551811d4d932 unbound from our chassis#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.204 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8954061-53dd-4ae5-b1f3-551811d4d932, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.205 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c414c6a1-dbfa-415c-80a2-c7036be142ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.206 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 namespace which is not needed anymore#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.210 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 30 04:35:57 np0005601977 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000022.scope: Consumed 12.216s CPU time.
Jan 30 04:35:57 np0005601977 systemd-machined[154431]: Machine qemu-29-instance-00000022 terminated.
Jan 30 04:35:57 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [NOTICE]   (221777) : haproxy version is 2.8.14-c23fe91
Jan 30 04:35:57 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [NOTICE]   (221777) : path to executable is /usr/sbin/haproxy
Jan 30 04:35:57 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [WARNING]  (221777) : Exiting Master process...
Jan 30 04:35:57 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [ALERT]    (221777) : Current worker (221779) exited with code 143 (Terminated)
Jan 30 04:35:57 np0005601977 neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932[221773]: [WARNING]  (221777) : All workers exited. Exiting... (0)
Jan 30 04:35:57 np0005601977 systemd[1]: libpod-ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c.scope: Deactivated successfully.
Jan 30 04:35:57 np0005601977 podman[221885]: 2026-01-30 09:35:57.348079896 +0000 UTC m=+0.045390064 container died ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:35:57 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c-userdata-shm.mount: Deactivated successfully.
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.387 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.387 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.388 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:57 np0005601977 systemd[1]: var-lib-containers-storage-overlay-1c568ccea9f2104372bada29075306d1170bc8a70de9485aff7b5f03bd712131-merged.mount: Deactivated successfully.
Jan 30 04:35:57 np0005601977 podman[221885]: 2026-01-30 09:35:57.400999967 +0000 UTC m=+0.098310125 container cleanup ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:35:57 np0005601977 systemd[1]: libpod-conmon-ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c.scope: Deactivated successfully.
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.421 183134 INFO nova.virt.libvirt.driver [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Instance destroyed successfully.#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.422 183134 DEBUG nova.objects.instance [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 119ec1c9-9292-45e9-ae02-0750dc595ccc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.441 183134 DEBUG nova.virt.libvirt.vif [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1374074683',display_name='tempest-TestNetworkAdvancedServerOps-server-1374074683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1374074683',id=34,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdPptm/HnzkqTImZqDAl7EyO2iWhxnsGpAr0rtNjW8XlU6U9DL+LFW5w4FNGYvckrrl1CqBUZRMRpzopHQZlljyrRKm4azU2AW9n9Zcwea07xun0VFkuuj0zR3/K+H88A==',key_name='tempest-TestNetworkAdvancedServerOps-15479994',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:35:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-svapo3ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:35:43Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=119ec1c9-9292-45e9-ae02-0750dc595ccc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.442 183134 DEBUG nova.network.os_vif_util [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.443 183134 DEBUG nova.network.os_vif_util [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.444 183134 DEBUG os_vif [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.447 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.448 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6bc941a-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.450 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.452 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.455 183134 INFO os_vif [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:a6:fc,bridge_name='br-int',has_traffic_filtering=True,id=a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17,network=Network(b8954061-53dd-4ae5-b1f3-551811d4d932),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc941a-f4')#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.455 183134 INFO nova.virt.libvirt.driver [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Deleting instance files /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_del#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.456 183134 INFO nova.virt.libvirt.driver [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Deletion of /var/lib/nova/instances/119ec1c9-9292-45e9-ae02-0750dc595ccc_del complete#033[00m
Jan 30 04:35:57 np0005601977 podman[221928]: 2026-01-30 09:35:57.477249493 +0000 UTC m=+0.052555921 container remove ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.481 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[679ecc0b-2648-48e6-a3cf-257e5f9571ee]: (4, ('Fri Jan 30 09:35:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 (ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c)\nac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c\nFri Jan 30 09:35:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 (ac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c)\nac6208718d68aedf7f327519944acd7659db47d78f3ae89c1dc92534fa62108c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.483 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa02413-5cb0-4e09-b7a5-18a635c8bd9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.484 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8954061-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.486 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 kernel: tapb8954061-50: left promiscuous mode
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.488 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.492 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.493 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[69548f94-d9b1-4368-861e-b57e04ebe392]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.511 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e06d0179-396f-4bf6-9e06-dd39c3559634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.513 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea85141-a3ff-4309-be25-f7b168229c2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.522 183134 INFO nova.compute.manager [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.523 183134 DEBUG oslo.service.loopingcall [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.524 183134 DEBUG nova.compute.manager [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:35:57 np0005601977 nova_compute[183130]: 2026-01-30 09:35:57.524 183134 DEBUG nova.network.neutron [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.524 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4bd13a-79f5-483e-be3b-fdbed2fc843f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426899, 'reachable_time': 33853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221945, 'error': None, 'target': 'ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:57 np0005601977 systemd[1]: run-netns-ovnmeta\x2db8954061\x2d53dd\x2d4ae5\x2db1f3\x2d551811d4d932.mount: Deactivated successfully.
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.527 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8954061-53dd-4ae5-b1f3-551811d4d932 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:35:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:57.527 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e21391-1d9f-4344-bc30-4388eb52f1b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:58 np0005601977 nova_compute[183130]: 2026-01-30 09:35:58.267 183134 DEBUG nova.network.neutron [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updated VIF entry in instance network info cache for port a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:35:58 np0005601977 nova_compute[183130]: 2026-01-30 09:35:58.267 183134 DEBUG nova.network.neutron [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [{"id": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "address": "fa:16:3e:70:a6:fc", "network": {"id": "b8954061-53dd-4ae5-b1f3-551811d4d932", "bridge": "br-int", "label": "tempest-network-smoke--1522265598", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc941a-f4", "ovs_interfaceid": "a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:58 np0005601977 nova_compute[183130]: 2026-01-30 09:35:58.286 183134 DEBUG oslo_concurrency.lockutils [req-8324f358-baa1-4210-8fec-5f1058346608 req-9b18e6d3-9a83-40cc-86d4-f63596658e9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-119ec1c9-9292-45e9-ae02-0750dc595ccc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.300 183134 DEBUG nova.network.neutron [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.336 183134 INFO nova.compute.manager [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Took 1.81 seconds to deallocate network for instance.#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.419 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.420 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.430 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.494 183134 INFO nova.scheduler.client.report [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocations for instance 119ec1c9-9292-45e9-ae02-0750dc595ccc#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.599 183134 DEBUG oslo_concurrency.lockutils [None req-585eb6cf-f02c-4d2f-b3b6-6d64307d6727 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "119ec1c9-9292-45e9-ae02-0750dc595ccc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:59.598 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:b4:6e 2001:db8:0:1:f816:3eff:febf:b46e 2001:db8::f816:3eff:febf:b46e'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:febf:b46e/64 2001:db8::f816:3eff:febf:b46e/64', 'neutron:device_id': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e38aee4e-ba47-49c3-9bdf-bed97e27acef, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79c5a8be-b732-4d5f-86e3-0f3d570c8b43) old=Port_Binding(mac=['fa:16:3e:bf:b4:6e 2001:db8::f816:3eff:febf:b46e'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:febf:b46e/64', 'neutron:device_id': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:59.600 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79c5a8be-b732-4d5f-86e3-0f3d570c8b43 in datapath f2b07532-97d0-4974-827c-4709f0bf52f6 updated#033[00m
Jan 30 04:35:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:59.602 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2b07532-97d0-4974-827c-4709f0bf52f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:35:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:35:59.603 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[39808442-2c0a-4825-ba6f-8cf63d599ca6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:35:59 np0005601977 nova_compute[183130]: 2026-01-30 09:35:59.738 183134 DEBUG nova.compute.manager [req-bd71881d-ece2-483b-aa92-0ac991748e18 req-d9a07805-9987-4368-9827-2eccfc3aa157 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Received event network-vif-deleted-a6bc941a-f499-4e2e-9e6e-2c0ba94f2b17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:00 np0005601977 podman[221946]: 2026-01-30 09:36:00.835722751 +0000 UTC m=+0.053125938 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:36:00 np0005601977 podman[221947]: 2026-01-30 09:36:00.868142299 +0000 UTC m=+0.087895234 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:36:00 np0005601977 nova_compute[183130]: 2026-01-30 09:36:00.868 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:02 np0005601977 nova_compute[183130]: 2026-01-30 09:36:02.450 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:03 np0005601977 nova_compute[183130]: 2026-01-30 09:36:03.648 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:03 np0005601977 nova_compute[183130]: 2026-01-30 09:36:03.730 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:04 np0005601977 nova_compute[183130]: 2026-01-30 09:36:04.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:04 np0005601977 podman[221990]: 2026-01-30 09:36:04.871023288 +0000 UTC m=+0.091628972 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.091 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.092 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.092 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.092 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.264 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.265 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=73.24976348876953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.266 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.266 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.336 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.337 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.362 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.379 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.408 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.408 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:05 np0005601977 nova_compute[183130]: 2026-01-30 09:36:05.870 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.358 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.358 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.430 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.452 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.539 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.539 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.557 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.557 183134 INFO nova.compute.claims [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.691 183134 DEBUG nova.compute.provider_tree [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.715 183134 DEBUG nova.scheduler.client.report [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.744 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.745 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.804 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.805 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.825 183134 INFO nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.843 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.951 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.953 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.954 183134 INFO nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Creating image(s)#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.955 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.956 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.957 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.980 183134 DEBUG nova.policy [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:36:07 np0005601977 nova_compute[183130]: 2026-01-30 09:36:07.984 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.039 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.040 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.041 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.065 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.115 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.117 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.151 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.152 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.153 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.199 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.200 183134 DEBUG nova.virt.disk.api [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.201 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.250 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.251 183134 DEBUG nova.virt.disk.api [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.252 183134 DEBUG nova.objects.instance [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid bae9749f-c9d5-45d2-978f-c3f5a0451b9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.273 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.274 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Ensure instance console log exists: /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.274 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.275 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:08 np0005601977 nova_compute[183130]: 2026-01-30 09:36:08.275 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:10 np0005601977 nova_compute[183130]: 2026-01-30 09:36:10.870 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:11 np0005601977 nova_compute[183130]: 2026-01-30 09:36:11.260 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Successfully created port: 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:36:11 np0005601977 nova_compute[183130]: 2026-01-30 09:36:11.407 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:11 np0005601977 nova_compute[183130]: 2026-01-30 09:36:11.408 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.136 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Successfully created port: 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.419 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765757.4186366, 119ec1c9-9292-45e9-ae02-0750dc595ccc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.420 183134 INFO nova.compute.manager [-] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.454 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:12 np0005601977 nova_compute[183130]: 2026-01-30 09:36:12.479 183134 DEBUG nova.compute.manager [None req-81cbdc10-6ffc-4aa1-986b-4048e43e5464 - - - - - -] [instance: 119ec1c9-9292-45e9-ae02-0750dc595ccc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.247 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Successfully updated port: 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.358 183134 DEBUG nova.compute.manager [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-changed-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.359 183134 DEBUG nova.compute.manager [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Refreshing instance network info cache due to event network-changed-9b60f325-bf20-4165-a9bc-76eed7a0ebd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.359 183134 DEBUG oslo_concurrency.lockutils [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.360 183134 DEBUG oslo_concurrency.lockutils [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.360 183134 DEBUG nova.network.neutron [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Refreshing network info cache for port 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.364 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.364 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.365 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:13 np0005601977 podman[222034]: 2026-01-30 09:36:13.824871415 +0000 UTC m=+0.041810390 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:36:13 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.825 183134 DEBUG nova.network.neutron [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:36:14 np0005601977 nova_compute[183130]: 2026-01-30 09:36:13.998 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Successfully updated port: 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:36:14 np0005601977 nova_compute[183130]: 2026-01-30 09:36:14.017 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:15 np0005601977 nova_compute[183130]: 2026-01-30 09:36:15.032 183134 DEBUG nova.network.neutron [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:15 np0005601977 nova_compute[183130]: 2026-01-30 09:36:15.054 183134 DEBUG oslo_concurrency.lockutils [req-48b785ed-a595-419f-a5b1-97801f9e91e4 req-1c25fc95-3bbe-4e02-a8af-2532a129338d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:15 np0005601977 nova_compute[183130]: 2026-01-30 09:36:15.055 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:15 np0005601977 nova_compute[183130]: 2026-01-30 09:36:15.055 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:36:15 np0005601977 nova_compute[183130]: 2026-01-30 09:36:15.359 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:15 np0005601977 nova_compute[183130]: 2026-01-30 09:36:15.871 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:16 np0005601977 nova_compute[183130]: 2026-01-30 09:36:16.064 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:36:16 np0005601977 nova_compute[183130]: 2026-01-30 09:36:16.135 183134 DEBUG nova.compute.manager [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-changed-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:16 np0005601977 nova_compute[183130]: 2026-01-30 09:36:16.136 183134 DEBUG nova.compute.manager [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Refreshing instance network info cache due to event network-changed-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:36:16 np0005601977 nova_compute[183130]: 2026-01-30 09:36:16.136 183134 DEBUG oslo_concurrency.lockutils [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:17 np0005601977 nova_compute[183130]: 2026-01-30 09:36:17.456 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:20 np0005601977 nova_compute[183130]: 2026-01-30 09:36:20.878 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.095 183134 DEBUG nova.network.neutron [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updating instance_info_cache with network_info: [{"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.117 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.118 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Instance network_info: |[{"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.118 183134 DEBUG oslo_concurrency.lockutils [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.118 183134 DEBUG nova.network.neutron [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Refreshing network info cache for port 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.121 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Start _get_guest_xml network_info=[{"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.125 183134 WARNING nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.130 183134 DEBUG nova.virt.libvirt.host [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.131 183134 DEBUG nova.virt.libvirt.host [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.134 183134 DEBUG nova.virt.libvirt.host [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.134 183134 DEBUG nova.virt.libvirt.host [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.135 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.135 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.136 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.136 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.136 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.136 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.136 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.136 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.137 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.137 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.137 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.137 183134 DEBUG nova.virt.hardware [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.140 183134 DEBUG nova.virt.libvirt.vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686712914',display_name='tempest-TestGettingAddress-server-1686712914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686712914',id=36,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-4yblfrnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:07Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=bae9749f-c9d5-45d2-978f-c3f5a0451b9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.140 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.141 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:45:bb,bridge_name='br-int',has_traffic_filtering=True,id=9b60f325-bf20-4165-a9bc-76eed7a0ebd2,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b60f325-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.141 183134 DEBUG nova.virt.libvirt.vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686712914',display_name='tempest-TestGettingAddress-server-1686712914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686712914',id=36,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-4yblfrnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:07Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=bae9749f-c9d5-45d2-978f-c3f5a0451b9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.142 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.142 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:14:d8,bridge_name='br-int',has_traffic_filtering=True,id=8968380f-68a0-46fe-aa6d-4ad70b0ce1e9,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968380f-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.143 183134 DEBUG nova.objects.instance [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid bae9749f-c9d5-45d2-978f-c3f5a0451b9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.158 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <uuid>bae9749f-c9d5-45d2-978f-c3f5a0451b9d</uuid>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <name>instance-00000024</name>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-1686712914</nova:name>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:36:21</nova:creationTime>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:port uuid="9b60f325-bf20-4165-a9bc-76eed7a0ebd2">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        <nova:port uuid="8968380f-68a0-46fe-aa6d-4ad70b0ce1e9">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe00:14d8" ipVersion="6"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe00:14d8" ipVersion="6"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <entry name="serial">bae9749f-c9d5-45d2-978f-c3f5a0451b9d</entry>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <entry name="uuid">bae9749f-c9d5-45d2-978f-c3f5a0451b9d</entry>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.config"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:c2:45:bb"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <target dev="tap9b60f325-bf"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:00:14:d8"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <target dev="tap8968380f-68"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/console.log" append="off"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:36:21 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:36:21 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:36:21 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:36:21 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.159 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Preparing to wait for external event network-vif-plugged-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.159 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.159 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.160 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.160 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Preparing to wait for external event network-vif-plugged-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.160 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.160 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.161 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.161 183134 DEBUG nova.virt.libvirt.vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686712914',display_name='tempest-TestGettingAddress-server-1686712914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686712914',id=36,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-4yblfrnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:07Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=bae9749f-c9d5-45d2-978f-c3f5a0451b9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.162 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.162 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:45:bb,bridge_name='br-int',has_traffic_filtering=True,id=9b60f325-bf20-4165-a9bc-76eed7a0ebd2,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b60f325-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.162 183134 DEBUG os_vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:45:bb,bridge_name='br-int',has_traffic_filtering=True,id=9b60f325-bf20-4165-a9bc-76eed7a0ebd2,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b60f325-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.163 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.163 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.163 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.165 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.165 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b60f325-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.166 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b60f325-bf, col_values=(('external_ids', {'iface-id': '9b60f325-bf20-4165-a9bc-76eed7a0ebd2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:45:bb', 'vm-uuid': 'bae9749f-c9d5-45d2-978f-c3f5a0451b9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.167 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 NetworkManager[55565]: <info>  [1769765781.1688] manager: (tap9b60f325-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.170 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.173 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.173 183134 INFO os_vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:45:bb,bridge_name='br-int',has_traffic_filtering=True,id=9b60f325-bf20-4165-a9bc-76eed7a0ebd2,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b60f325-bf')#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.174 183134 DEBUG nova.virt.libvirt.vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686712914',display_name='tempest-TestGettingAddress-server-1686712914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686712914',id=36,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-4yblfrnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:07Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=bae9749f-c9d5-45d2-978f-c3f5a0451b9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.174 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.175 183134 DEBUG nova.network.os_vif_util [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:14:d8,bridge_name='br-int',has_traffic_filtering=True,id=8968380f-68a0-46fe-aa6d-4ad70b0ce1e9,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968380f-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.175 183134 DEBUG os_vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:14:d8,bridge_name='br-int',has_traffic_filtering=True,id=8968380f-68a0-46fe-aa6d-4ad70b0ce1e9,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968380f-68') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.176 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.176 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.176 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.177 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.177 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8968380f-68, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.178 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8968380f-68, col_values=(('external_ids', {'iface-id': '8968380f-68a0-46fe-aa6d-4ad70b0ce1e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:14:d8', 'vm-uuid': 'bae9749f-c9d5-45d2-978f-c3f5a0451b9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.179 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 NetworkManager[55565]: <info>  [1769765781.1802] manager: (tap8968380f-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.181 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.183 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.184 183134 INFO os_vif [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:14:d8,bridge_name='br-int',has_traffic_filtering=True,id=8968380f-68a0-46fe-aa6d-4ad70b0ce1e9,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968380f-68')#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.242 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.243 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.243 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:c2:45:bb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.243 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:00:14:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:36:21 np0005601977 nova_compute[183130]: 2026-01-30 09:36:21.244 183134 INFO nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Using config drive#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.228 183134 INFO nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Creating config drive at /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.config#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.236 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwh4wgfb3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.357 183134 DEBUG oslo_concurrency.processutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwh4wgfb3" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:22 np0005601977 kernel: tap9b60f325-bf: entered promiscuous mode
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.3970] manager: (tap9b60f325-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00364|binding|INFO|Claiming lport 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 for this chassis.
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00365|binding|INFO|9b60f325-bf20-4165-a9bc-76eed7a0ebd2: Claiming fa:16:3e:c2:45:bb 10.100.0.3
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.398 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.405 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.4066] manager: (tap8968380f-68): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Jan 30 04:36:22 np0005601977 kernel: tap8968380f-68: entered promiscuous mode
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.409 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00366|if_status|INFO|Not updating pb chassis for 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 now as sb is readonly
Jan 30 04:36:22 np0005601977 systemd-udevd[222086]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:36:22 np0005601977 systemd-udevd[222085]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.422 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:45:bb 10.100.0.3'], port_security=['fa:16:3e:c2:45:bb 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bae9749f-c9d5-45d2-978f-c3f5a0451b9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2edea48-b03a-4c39-b516-89355e7acf87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd1977375-373b-46bc-9d23-918fa4e3324a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ae2ce11-04ba-4a06-91ff-cbcd0cf4d441, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=9b60f325-bf20-4165-a9bc-76eed7a0ebd2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00367|binding|INFO|Claiming lport 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 for this chassis.
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00368|binding|INFO|8968380f-68a0-46fe-aa6d-4ad70b0ce1e9: Claiming fa:16:3e:00:14:d8 2001:db8:0:1:f816:3eff:fe00:14d8 2001:db8::f816:3eff:fe00:14d8
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.422 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 in datapath b2edea48-b03a-4c39-b516-89355e7acf87 bound to our chassis#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.423 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2edea48-b03a-4c39-b516-89355e7acf87#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.427 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:14:d8 2001:db8:0:1:f816:3eff:fe00:14d8 2001:db8::f816:3eff:fe00:14d8'], port_security=['fa:16:3e:00:14:d8 2001:db8:0:1:f816:3eff:fe00:14d8 2001:db8::f816:3eff:fe00:14d8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe00:14d8/64 2001:db8::f816:3eff:fe00:14d8/64', 'neutron:device_id': 'bae9749f-c9d5-45d2-978f-c3f5a0451b9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd1977375-373b-46bc-9d23-918fa4e3324a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e38aee4e-ba47-49c3-9bdf-bed97e27acef, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=8968380f-68a0-46fe-aa6d-4ad70b0ce1e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.4292] device (tap9b60f325-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.4299] device (tap9b60f325-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.4322] device (tap8968380f-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.431 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.4331] device (tap8968380f-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.432 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c5322906-e5de-4667-a662-a92d663a7acd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.433 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2edea48-b1 in ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00369|binding|INFO|Setting lport 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 ovn-installed in OVS
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00370|binding|INFO|Setting lport 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 up in Southbound
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.435 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2edea48-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.435 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f400659a-c87e-4a57-ac4e-123fe4f57713]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 systemd-machined[154431]: New machine qemu-30-instance-00000024.
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.435 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[074f7bbc-094c-4b8b-bebd-a7430cc15fd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00371|binding|INFO|Setting lport 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 ovn-installed in OVS
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.438 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00372|binding|INFO|Setting lport 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 up in Southbound
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.445 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd3c852-a8ed-4cad-a9eb-7878eaf06e38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.456 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a5a35a-0a59-434c-8bcc-6e9e861d6e05]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 systemd[1]: Started Virtual Machine qemu-30-instance-00000024.
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.479 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[cc858402-c4e9-46e9-8004-9ab50e8dd94c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.483 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4a6993-bef8-4b0d-965a-548c7ccb55ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.4840] manager: (tapb2edea48-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.504 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7c222e-5b65-4736-9b0d-f47471fbf966]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.507 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[25acff01-a2ef-4640-8255-b623d8ce9e82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.5240] device (tapb2edea48-b0): carrier: link connected
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.528 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0eb15c-789d-4dd4-bde7-09eba2ea3c27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.544 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c8460508-d4ae-4313-8d61-202416f0fd65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2edea48-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:be:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431386, 'reachable_time': 44442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222120, 'error': None, 'target': 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.554 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c12ed5-75d8-4c78-be07-1fd7523b25e4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:bebe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431386, 'tstamp': 431386}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222121, 'error': None, 'target': 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.566 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[101ad8f6-aeac-41ec-a137-03a02222a9a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2edea48-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:be:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431386, 'reachable_time': 44442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222122, 'error': None, 'target': 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.588 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ed19afd2-8282-49c2-9975-507ff16cf1ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.631 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9b02369c-6095-4714-bc59-8d655afb5052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.632 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2edea48-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.633 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.633 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2edea48-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.635 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 kernel: tapb2edea48-b0: entered promiscuous mode
Jan 30 04:36:22 np0005601977 NetworkManager[55565]: <info>  [1769765782.6364] manager: (tapb2edea48-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.639 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.640 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2edea48-b0, col_values=(('external_ids', {'iface-id': '75d877f2-b388-4f11-9237-a14c4feee2ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.642 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:22Z|00373|binding|INFO|Releasing lport 75d877f2-b388-4f11-9237-a14c4feee2ce from this chassis (sb_readonly=0)
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.650 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.651 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2edea48-b03a-4c39-b516-89355e7acf87.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2edea48-b03a-4c39-b516-89355e7acf87.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.652 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f4aad4-d454-4b73-8254-f21f12e01b3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.653 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b2edea48-b03a-4c39-b516-89355e7acf87
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b2edea48-b03a-4c39-b516-89355e7acf87.pid.haproxy
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b2edea48-b03a-4c39-b516-89355e7acf87
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:36:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:22.655 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'env', 'PROCESS_TAG=haproxy-b2edea48-b03a-4c39-b516-89355e7acf87', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2edea48-b03a-4c39-b516-89355e7acf87.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.697 183134 DEBUG nova.compute.manager [req-fafe9119-cd15-4ced-9625-95f4023ed45d req-4a9fd2e3-55d4-48fa-be9f-2539fc8e13e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-vif-plugged-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.698 183134 DEBUG oslo_concurrency.lockutils [req-fafe9119-cd15-4ced-9625-95f4023ed45d req-4a9fd2e3-55d4-48fa-be9f-2539fc8e13e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.698 183134 DEBUG oslo_concurrency.lockutils [req-fafe9119-cd15-4ced-9625-95f4023ed45d req-4a9fd2e3-55d4-48fa-be9f-2539fc8e13e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.699 183134 DEBUG oslo_concurrency.lockutils [req-fafe9119-cd15-4ced-9625-95f4023ed45d req-4a9fd2e3-55d4-48fa-be9f-2539fc8e13e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.699 183134 DEBUG nova.compute.manager [req-fafe9119-cd15-4ced-9625-95f4023ed45d req-4a9fd2e3-55d4-48fa-be9f-2539fc8e13e5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Processing event network-vif-plugged-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.733 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765782.732435, bae9749f-c9d5-45d2-978f-c3f5a0451b9d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.733 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] VM Started (Lifecycle Event)#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.755 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.762 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765782.7347686, bae9749f-c9d5-45d2-978f-c3f5a0451b9d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.762 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.782 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.785 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.813 183134 DEBUG nova.compute.manager [req-30bba115-8a47-4810-89bd-5f0e89ac65bc req-df9f4c59-3f13-480a-b57f-81ff16a63e64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-vif-plugged-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.814 183134 DEBUG oslo_concurrency.lockutils [req-30bba115-8a47-4810-89bd-5f0e89ac65bc req-df9f4c59-3f13-480a-b57f-81ff16a63e64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.814 183134 DEBUG oslo_concurrency.lockutils [req-30bba115-8a47-4810-89bd-5f0e89ac65bc req-df9f4c59-3f13-480a-b57f-81ff16a63e64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.815 183134 DEBUG oslo_concurrency.lockutils [req-30bba115-8a47-4810-89bd-5f0e89ac65bc req-df9f4c59-3f13-480a-b57f-81ff16a63e64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.815 183134 DEBUG nova.compute.manager [req-30bba115-8a47-4810-89bd-5f0e89ac65bc req-df9f4c59-3f13-480a-b57f-81ff16a63e64 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Processing event network-vif-plugged-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.817 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.818 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.821 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765782.8216832, bae9749f-c9d5-45d2-978f-c3f5a0451b9d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.822 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.826 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.832 183134 INFO nova.virt.libvirt.driver [-] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Instance spawned successfully.#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.832 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.845 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.852 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.856 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.856 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.857 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.857 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.858 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.858 183134 DEBUG nova.virt.libvirt.driver [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.869 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.911 183134 INFO nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Took 14.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.911 183134 DEBUG nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.972 183134 INFO nova.compute.manager [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Took 15.48 seconds to build instance.#033[00m
Jan 30 04:36:22 np0005601977 nova_compute[183130]: 2026-01-30 09:36:22.989 183134 DEBUG oslo_concurrency.lockutils [None req-a3b65f99-33d8-4d66-ac21-fb56f8de959b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:23 np0005601977 podman[222159]: 2026-01-30 09:36:23.00226805 +0000 UTC m=+0.058345839 container create 0771da6e8721612800851ffe63ca2f10d0522db795b329775e860b3466fdbaa4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:36:23 np0005601977 systemd[1]: Started libpod-conmon-0771da6e8721612800851ffe63ca2f10d0522db795b329775e860b3466fdbaa4.scope.
Jan 30 04:36:23 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:36:23 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/667c6795147303c082517e6f4bb75fda9b78c1879312af33796ac5d895b29602/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:36:23 np0005601977 podman[222159]: 2026-01-30 09:36:22.981135479 +0000 UTC m=+0.037213278 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:36:23 np0005601977 podman[222159]: 2026-01-30 09:36:23.070965477 +0000 UTC m=+0.127043286 container init 0771da6e8721612800851ffe63ca2f10d0522db795b329775e860b3466fdbaa4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:36:23 np0005601977 podman[222159]: 2026-01-30 09:36:23.077360372 +0000 UTC m=+0.133438161 container start 0771da6e8721612800851ffe63ca2f10d0522db795b329775e860b3466fdbaa4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 30 04:36:23 np0005601977 podman[222173]: 2026-01-30 09:36:23.083431418 +0000 UTC m=+0.055183297 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-type=git, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public)
Jan 30 04:36:23 np0005601977 podman[222174]: 2026-01-30 09:36:23.088887796 +0000 UTC m=+0.057168615 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:36:23 np0005601977 neutron-haproxy-ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87[222183]: [NOTICE]   (222218) : New worker (222221) forked
Jan 30 04:36:23 np0005601977 neutron-haproxy-ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87[222183]: [NOTICE]   (222218) : Loading success.
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.122 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 in datapath f2b07532-97d0-4974-827c-4709f0bf52f6 unbound from our chassis#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.123 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2b07532-97d0-4974-827c-4709f0bf52f6#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.131 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1cf3a7-0385-475c-bb93-9a67edfb2a58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.132 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2b07532-91 in ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.136 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2b07532-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.136 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8d423f08-29c1-41a9-afa4-7040571d230d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.137 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb5d41e-2a18-4aad-b410-ab301b5cd095]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.145 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[5b186c5b-8d63-4360-b124-67583b22c81a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.154 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[29b3a887-1cfb-42c9-979d-319da6baecab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.174 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[44978857-32b4-400f-b316-fc71fb0b75f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.180 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[96156ca1-8f20-46af-bf26-21ff4c5c22a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 systemd-udevd[222111]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:36:23 np0005601977 NetworkManager[55565]: <info>  [1769765783.1838] manager: (tapf2b07532-90): new Veth device (/org/freedesktop/NetworkManager/Devices/149)
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.200 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[92f70c2d-07e7-4217-8c44-a36f207a8cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.203 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2b687b3c-1c0f-40b9-96f7-29c5f1ae9bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 NetworkManager[55565]: <info>  [1769765783.2193] device (tapf2b07532-90): carrier: link connected
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.223 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[66df7002-1911-45e5-a5c7-7ec3740581d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.238 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9f532785-8eff-4a76-9e83-2f73838f48bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2b07532-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:b4:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431455, 'reachable_time': 31800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222240, 'error': None, 'target': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.251 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c028d93d-0f24-4643-a3c0-fac57417f794]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:b46e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431455, 'tstamp': 431455}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222241, 'error': None, 'target': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.266 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[17b1896c-825f-41e8-9a58-ec516b30a1c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2b07532-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:b4:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431455, 'reachable_time': 31800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222242, 'error': None, 'target': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.290 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[51c43e54-d1de-491e-b28b-d5365756a003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.315 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee40b4a-aea1-4012-9517-afc20c395a31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.316 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2b07532-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.317 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.317 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2b07532-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:23 np0005601977 nova_compute[183130]: 2026-01-30 09:36:23.319 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:23 np0005601977 NetworkManager[55565]: <info>  [1769765783.3205] manager: (tapf2b07532-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 30 04:36:23 np0005601977 kernel: tapf2b07532-90: entered promiscuous mode
Jan 30 04:36:23 np0005601977 nova_compute[183130]: 2026-01-30 09:36:23.322 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.324 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2b07532-90, col_values=(('external_ids', {'iface-id': '79c5a8be-b732-4d5f-86e3-0f3d570c8b43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:23 np0005601977 nova_compute[183130]: 2026-01-30 09:36:23.325 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:23Z|00374|binding|INFO|Releasing lport 79c5a8be-b732-4d5f-86e3-0f3d570c8b43 from this chassis (sb_readonly=0)
Jan 30 04:36:23 np0005601977 nova_compute[183130]: 2026-01-30 09:36:23.326 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.326 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2b07532-97d0-4974-827c-4709f0bf52f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2b07532-97d0-4974-827c-4709f0bf52f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.327 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[acf6c372-b440-4e31-89a5-573f1e35273f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.328 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-f2b07532-97d0-4974-827c-4709f0bf52f6
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/f2b07532-97d0-4974-827c-4709f0bf52f6.pid.haproxy
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID f2b07532-97d0-4974-827c-4709f0bf52f6
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:36:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:23.329 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'env', 'PROCESS_TAG=haproxy-f2b07532-97d0-4974-827c-4709f0bf52f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2b07532-97d0-4974-827c-4709f0bf52f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:36:23 np0005601977 nova_compute[183130]: 2026-01-30 09:36:23.330 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:23 np0005601977 podman[222272]: 2026-01-30 09:36:23.701234851 +0000 UTC m=+0.044105667 container create 412a1bb455ec898c7ad68bf15c4b040d203aed745bd202d0c061ea53f533169f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 30 04:36:23 np0005601977 systemd[1]: Started libpod-conmon-412a1bb455ec898c7ad68bf15c4b040d203aed745bd202d0c061ea53f533169f.scope.
Jan 30 04:36:23 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:36:23 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/811ff601878c860ad9ff6cbcf062aa8aacc2dff58d484d86e1868409f0631517/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:36:23 np0005601977 podman[222272]: 2026-01-30 09:36:23.677751661 +0000 UTC m=+0.020622497 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:36:23 np0005601977 podman[222272]: 2026-01-30 09:36:23.772718138 +0000 UTC m=+0.115588954 container init 412a1bb455ec898c7ad68bf15c4b040d203aed745bd202d0c061ea53f533169f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:36:23 np0005601977 podman[222272]: 2026-01-30 09:36:23.777597209 +0000 UTC m=+0.120468025 container start 412a1bb455ec898c7ad68bf15c4b040d203aed745bd202d0c061ea53f533169f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:36:23 np0005601977 neutron-haproxy-ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6[222288]: [NOTICE]   (222292) : New worker (222294) forked
Jan 30 04:36:23 np0005601977 neutron-haproxy-ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6[222288]: [NOTICE]   (222292) : Loading success.
Jan 30 04:36:24 np0005601977 nova_compute[183130]: 2026-01-30 09:36:24.133 183134 DEBUG nova.network.neutron [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updated VIF entry in instance network info cache for port 8968380f-68a0-46fe-aa6d-4ad70b0ce1e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:36:24 np0005601977 nova_compute[183130]: 2026-01-30 09:36:24.134 183134 DEBUG nova.network.neutron [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updating instance_info_cache with network_info: [{"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:24 np0005601977 nova_compute[183130]: 2026-01-30 09:36:24.158 183134 DEBUG oslo_concurrency.lockutils [req-860ba289-fe50-4b11-b774-ab4161637090 req-6eef1c43-011e-4ca3-b451-bc24d5f495ac dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.465 183134 DEBUG nova.compute.manager [req-adcd86c0-7980-4caa-8791-91f42c44436b req-0b0f1e47-7817-4aa6-8530-3ad981cc6ef6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-vif-plugged-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.466 183134 DEBUG oslo_concurrency.lockutils [req-adcd86c0-7980-4caa-8791-91f42c44436b req-0b0f1e47-7817-4aa6-8530-3ad981cc6ef6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.466 183134 DEBUG oslo_concurrency.lockutils [req-adcd86c0-7980-4caa-8791-91f42c44436b req-0b0f1e47-7817-4aa6-8530-3ad981cc6ef6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.466 183134 DEBUG oslo_concurrency.lockutils [req-adcd86c0-7980-4caa-8791-91f42c44436b req-0b0f1e47-7817-4aa6-8530-3ad981cc6ef6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.466 183134 DEBUG nova.compute.manager [req-adcd86c0-7980-4caa-8791-91f42c44436b req-0b0f1e47-7817-4aa6-8530-3ad981cc6ef6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] No waiting events found dispatching network-vif-plugged-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.467 183134 WARNING nova.compute.manager [req-adcd86c0-7980-4caa-8791-91f42c44436b req-0b0f1e47-7817-4aa6-8530-3ad981cc6ef6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received unexpected event network-vif-plugged-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.584 183134 DEBUG nova.compute.manager [req-d474d69a-4aef-4690-a373-2b10fbbfe159 req-0b7de85a-21bd-4f62-9892-60ed32f89e77 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-vif-plugged-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.584 183134 DEBUG oslo_concurrency.lockutils [req-d474d69a-4aef-4690-a373-2b10fbbfe159 req-0b7de85a-21bd-4f62-9892-60ed32f89e77 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.585 183134 DEBUG oslo_concurrency.lockutils [req-d474d69a-4aef-4690-a373-2b10fbbfe159 req-0b7de85a-21bd-4f62-9892-60ed32f89e77 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.585 183134 DEBUG oslo_concurrency.lockutils [req-d474d69a-4aef-4690-a373-2b10fbbfe159 req-0b7de85a-21bd-4f62-9892-60ed32f89e77 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "bae9749f-c9d5-45d2-978f-c3f5a0451b9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.585 183134 DEBUG nova.compute.manager [req-d474d69a-4aef-4690-a373-2b10fbbfe159 req-0b7de85a-21bd-4f62-9892-60ed32f89e77 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] No waiting events found dispatching network-vif-plugged-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.585 183134 WARNING nova.compute.manager [req-d474d69a-4aef-4690-a373-2b10fbbfe159 req-0b7de85a-21bd-4f62-9892-60ed32f89e77 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received unexpected event network-vif-plugged-8968380f-68a0-46fe-aa6d-4ad70b0ce1e9 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:36:25 np0005601977 nova_compute[183130]: 2026-01-30 09:36:25.880 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:26 np0005601977 nova_compute[183130]: 2026-01-30 09:36:26.180 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:26 np0005601977 nova_compute[183130]: 2026-01-30 09:36:26.386 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:26 np0005601977 NetworkManager[55565]: <info>  [1769765786.3865] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 30 04:36:26 np0005601977 NetworkManager[55565]: <info>  [1769765786.3871] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Jan 30 04:36:26 np0005601977 nova_compute[183130]: 2026-01-30 09:36:26.408 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:26 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:26Z|00375|binding|INFO|Releasing lport 75d877f2-b388-4f11-9237-a14c4feee2ce from this chassis (sb_readonly=0)
Jan 30 04:36:26 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:26Z|00376|binding|INFO|Releasing lport 79c5a8be-b732-4d5f-86e3-0f3d570c8b43 from this chassis (sb_readonly=0)
Jan 30 04:36:26 np0005601977 nova_compute[183130]: 2026-01-30 09:36:26.422 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.432 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.433 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.452 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.545 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.545 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.552 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.552 183134 INFO nova.compute.claims [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.651 183134 DEBUG nova.compute.manager [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Received event network-changed-9b60f325-bf20-4165-a9bc-76eed7a0ebd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.652 183134 DEBUG nova.compute.manager [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Refreshing instance network info cache due to event network-changed-9b60f325-bf20-4165-a9bc-76eed7a0ebd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.653 183134 DEBUG oslo_concurrency.lockutils [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.653 183134 DEBUG oslo_concurrency.lockutils [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.653 183134 DEBUG nova.network.neutron [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Refreshing network info cache for port 9b60f325-bf20-4165-a9bc-76eed7a0ebd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.685 183134 DEBUG nova.compute.provider_tree [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.701 183134 DEBUG nova.scheduler.client.report [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.721 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.722 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.763 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.764 183134 DEBUG nova.network.neutron [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.785 183134 INFO nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.804 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.896 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.898 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.899 183134 INFO nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Creating image(s)#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.900 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.900 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.901 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.914 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.983 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.984 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.985 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:27 np0005601977 nova_compute[183130]: 2026-01-30 09:36:27.997 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.051 183134 DEBUG nova.policy [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67d560d0067b4b56aa346073fcc16d6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.065 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.067 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.100 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.102 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.102 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.166 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.168 183134 DEBUG nova.virt.disk.api [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.169 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.224 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.225 183134 DEBUG nova.virt.disk.api [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.225 183134 DEBUG nova.objects.instance [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.249 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.250 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Ensure instance console log exists: /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.251 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.251 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:28 np0005601977 nova_compute[183130]: 2026-01-30 09:36:28.251 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.073 183134 DEBUG nova.network.neutron [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Successfully created port: ce687f2c-b090-4884-959e-0d0e5154ace0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.293 183134 DEBUG nova.network.neutron [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updated VIF entry in instance network info cache for port 9b60f325-bf20-4165-a9bc-76eed7a0ebd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.294 183134 DEBUG nova.network.neutron [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updating instance_info_cache with network_info: [{"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.314 183134 DEBUG oslo_concurrency.lockutils [req-3bc27de5-2768-441e-8f0d-f4f1716b26d4 req-22b1ac79-3376-4856-9c09-ef7c91698097 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.769 183134 DEBUG nova.network.neutron [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Successfully updated port: ce687f2c-b090-4884-959e-0d0e5154ace0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.796 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.797 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.797 183134 DEBUG nova.network.neutron [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.863 183134 DEBUG nova.compute.manager [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-changed-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.864 183134 DEBUG nova.compute.manager [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Refreshing instance network info cache due to event network-changed-ce687f2c-b090-4884-959e-0d0e5154ace0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:36:29 np0005601977 nova_compute[183130]: 2026-01-30 09:36:29.864 183134 DEBUG oslo_concurrency.lockutils [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.002 183134 DEBUG nova.network.neutron [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.793 183134 DEBUG nova.network.neutron [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Updating instance_info_cache with network_info: [{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.832 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.833 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance network_info: |[{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.834 183134 DEBUG oslo_concurrency.lockutils [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.835 183134 DEBUG nova.network.neutron [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Refreshing network info cache for port ce687f2c-b090-4884-959e-0d0e5154ace0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.839 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Start _get_guest_xml network_info=[{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.846 183134 WARNING nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.850 183134 DEBUG nova.virt.libvirt.host [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.851 183134 DEBUG nova.virt.libvirt.host [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.862 183134 DEBUG nova.virt.libvirt.host [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.863 183134 DEBUG nova.virt.libvirt.host [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.864 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.864 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.865 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.865 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.866 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.866 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.866 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.867 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.867 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.867 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.868 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.868 183134 DEBUG nova.virt.hardware [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.872 183134 DEBUG nova.virt.libvirt.vif [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:36:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1216546466',display_name='tempest-TestNetworkAdvancedServerOps-server-1216546466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1216546466',id=37,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF4xzDdvAT3TB6+pS58kA/Q6TjKvRHtp/rfQ8kRk+luyL76Y6GS0NpLkZhqSFAsNbiAU1WKUenXI/Y9pKszXstyjqDm111TlZMUsrXUlqK8gspewyPLq0RjqkSmEPiU09g==',key_name='tempest-TestNetworkAdvancedServerOps-198083341',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8jcs92kv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:27Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=e8bcd6f1-636f-4cb1-8133-fa91df48fe59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.873 183134 DEBUG nova.network.os_vif_util [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.873 183134 DEBUG nova.network.os_vif_util [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.874 183134 DEBUG nova.objects.instance [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.893 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <uuid>e8bcd6f1-636f-4cb1-8133-fa91df48fe59</uuid>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <name>instance-00000025</name>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1216546466</nova:name>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:36:30</nova:creationTime>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        <nova:port uuid="ce687f2c-b090-4884-959e-0d0e5154ace0">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <entry name="serial">e8bcd6f1-636f-4cb1-8133-fa91df48fe59</entry>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <entry name="uuid">e8bcd6f1-636f-4cb1-8133-fa91df48fe59</entry>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:4d:a0:0d"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <target dev="tapce687f2c-b0"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/console.log" append="off"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:36:30 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:36:30 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:36:30 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:36:30 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.899 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Preparing to wait for external event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.900 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.900 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.900 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.901 183134 DEBUG nova.virt.libvirt.vif [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:36:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1216546466',display_name='tempest-TestNetworkAdvancedServerOps-server-1216546466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1216546466',id=37,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF4xzDdvAT3TB6+pS58kA/Q6TjKvRHtp/rfQ8kRk+luyL76Y6GS0NpLkZhqSFAsNbiAU1WKUenXI/Y9pKszXstyjqDm111TlZMUsrXUlqK8gspewyPLq0RjqkSmEPiU09g==',key_name='tempest-TestNetworkAdvancedServerOps-198083341',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8jcs92kv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:27Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=e8bcd6f1-636f-4cb1-8133-fa91df48fe59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.902 183134 DEBUG nova.network.os_vif_util [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.902 183134 DEBUG nova.network.os_vif_util [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.903 183134 DEBUG os_vif [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.903 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.904 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.904 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.908 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.909 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce687f2c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.909 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce687f2c-b0, col_values=(('external_ids', {'iface-id': 'ce687f2c-b090-4884-959e-0d0e5154ace0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:a0:0d', 'vm-uuid': 'e8bcd6f1-636f-4cb1-8133-fa91df48fe59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.911 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:30 np0005601977 NetworkManager[55565]: <info>  [1769765790.9125] manager: (tapce687f2c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.913 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.922 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.923 183134 INFO os_vif [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0')#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.976 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.976 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.976 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] No VIF found with MAC fa:16:3e:4d:a0:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:36:30 np0005601977 nova_compute[183130]: 2026-01-30 09:36:30.977 183134 INFO nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Using config drive#033[00m
Jan 30 04:36:31 np0005601977 podman[222321]: 2026-01-30 09:36:31.029273304 +0000 UTC m=+0.074904448 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:36:31 np0005601977 podman[222323]: 2026-01-30 09:36:31.033233769 +0000 UTC m=+0.073852148 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:36:33 np0005601977 nova_compute[183130]: 2026-01-30 09:36:33.548 183134 INFO nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Creating config drive at /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config#033[00m
Jan 30 04:36:33 np0005601977 nova_compute[183130]: 2026-01-30 09:36:33.551 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsrf6l4zv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:45:bb 10.100.0.3
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:45:bb 10.100.0.3
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.373 183134 DEBUG oslo_concurrency.processutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsrf6l4zv" returned: 0 in 0.822s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:36:34 np0005601977 kernel: tapce687f2c-b0: entered promiscuous mode
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.414 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00377|binding|INFO|Claiming lport ce687f2c-b090-4884-959e-0d0e5154ace0 for this chassis.
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00378|binding|INFO|ce687f2c-b090-4884-959e-0d0e5154ace0: Claiming fa:16:3e:4d:a0:0d 10.100.0.14
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.417 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 NetworkManager[55565]: <info>  [1769765794.4193] manager: (tapce687f2c-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.419 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00379|binding|INFO|Setting lport ce687f2c-b090-4884-959e-0d0e5154ace0 ovn-installed in OVS
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.426 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00380|binding|INFO|Setting lport ce687f2c-b090-4884-959e-0d0e5154ace0 up in Southbound
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.441 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:a0:0d 10.100.0.14'], port_security=['fa:16:3e:4d:a0:0d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8bcd6f1-636f-4cb1-8133-fa91df48fe59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a324d25-aadb-48ba-b761-6712d942455e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a67ba197-5b77-4e39-9974-dcaa8c946237', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=012b2bd2-c801-40d7-8aac-bd8116324a2b, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=ce687f2c-b090-4884-959e-0d0e5154ace0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:36:34 np0005601977 systemd-udevd[222399]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:36:34 np0005601977 systemd-machined[154431]: New machine qemu-31-instance-00000025.
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.443 104706 INFO neutron.agent.ovn.metadata.agent [-] Port ce687f2c-b090-4884-959e-0d0e5154ace0 in datapath 0a324d25-aadb-48ba-b761-6712d942455e bound to our chassis#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.446 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a324d25-aadb-48ba-b761-6712d942455e#033[00m
Jan 30 04:36:34 np0005601977 NetworkManager[55565]: <info>  [1769765794.4502] device (tapce687f2c-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:36:34 np0005601977 NetworkManager[55565]: <info>  [1769765794.4506] device (tapce687f2c-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:36:34 np0005601977 systemd[1]: Started Virtual Machine qemu-31-instance-00000025.
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.457 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9201ea-04c9-4937-b578-2afcb6ff6645]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.458 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a324d25-a1 in ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.460 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a324d25-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.460 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[95bdd22d-b633-49e1-a105-9a578455cdf2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.462 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[083cd119-3bce-40e0-9c3f-7653082f21a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.472 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[a17f1883-c3b1-432b-9803-06897bc57255]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.492 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[02c4781c-32a7-4689-aef7-5a4a7472a399]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.512 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[40f0071a-7a24-4deb-9ec7-32cd37002655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 NetworkManager[55565]: <info>  [1769765794.5192] manager: (tap0a324d25-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.518 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f840be-6b16-447b-8941-093f8a00ca88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 systemd-udevd[222402]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.540 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0236e3-f644-4cd0-a31b-3c28bae29a8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.543 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[94d2bc00-9f2e-491c-be6d-27a33343a264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 NetworkManager[55565]: <info>  [1769765794.5574] device (tap0a324d25-a0): carrier: link connected
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.561 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[782c57ec-facc-471a-a861-9700a79b6796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.574 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[03e5d4c1-052a-451a-ace8-e920d80bd16c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a324d25-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:e2:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432589, 'reachable_time': 34856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222432, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.586 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f476309f-588d-420a-8c94-361422d1affe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:e279'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432589, 'tstamp': 432589}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222433, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.596 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f7396a4e-7ebc-4b09-acaf-139a05256cda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a324d25-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:e2:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432589, 'reachable_time': 34856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222434, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.624 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[29a11c95-49c7-4a67-8375-dfb948a99f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.664 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0e350830-bbe6-41a9-b86a-90a3b0ca3cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.665 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a324d25-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.665 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.666 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a324d25-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.667 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 kernel: tap0a324d25-a0: entered promiscuous mode
Jan 30 04:36:34 np0005601977 NetworkManager[55565]: <info>  [1769765794.6680] manager: (tap0a324d25-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.669 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.670 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a324d25-a0, col_values=(('external_ids', {'iface-id': '4cd221df-f660-4aff-810f-059c8ba15ad5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.671 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:34Z|00381|binding|INFO|Releasing lport 4cd221df-f660-4aff-810f-059c8ba15ad5 from this chassis (sb_readonly=0)
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.673 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.673 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a324d25-aadb-48ba-b761-6712d942455e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a324d25-aadb-48ba-b761-6712d942455e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.674 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a978e1-918c-4be7-9e5a-3aa67a4b4682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.674 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-0a324d25-aadb-48ba-b761-6712d942455e
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/0a324d25-aadb-48ba-b761-6712d942455e.pid.haproxy
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 0a324d25-aadb-48ba-b761-6712d942455e
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:36:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:34.675 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'env', 'PROCESS_TAG=haproxy-0a324d25-aadb-48ba-b761-6712d942455e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a324d25-aadb-48ba-b761-6712d942455e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.678 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.825 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765794.825369, e8bcd6f1-636f-4cb1-8133-fa91df48fe59 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.826 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] VM Started (Lifecycle Event)#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.868 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.872 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765794.8255906, e8bcd6f1-636f-4cb1-8133-fa91df48fe59 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.873 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.931 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.934 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:36:34 np0005601977 nova_compute[183130]: 2026-01-30 09:36:34.954 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:36:34 np0005601977 podman[222473]: 2026-01-30 09:36:34.973455326 +0000 UTC m=+0.047981099 container create 70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:36:34 np0005601977 systemd[1]: Started libpod-conmon-70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6.scope.
Jan 30 04:36:35 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:36:35 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e412282b4229448a96b15d69ff4889b50627dc216124399eea0c32cff672e2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:36:35 np0005601977 podman[222473]: 2026-01-30 09:36:35.021748053 +0000 UTC m=+0.096273836 container init 70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:36:35 np0005601977 podman[222473]: 2026-01-30 09:36:35.027596022 +0000 UTC m=+0.102121785 container start 70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:36:35 np0005601977 podman[222473]: 2026-01-30 09:36:34.952353875 +0000 UTC m=+0.026879678 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:36:35 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [NOTICE]   (222503) : New worker (222512) forked
Jan 30 04:36:35 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [NOTICE]   (222503) : Loading success.
Jan 30 04:36:35 np0005601977 podman[222486]: 2026-01-30 09:36:35.070945046 +0000 UTC m=+0.068738420 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.101 183134 DEBUG nova.network.neutron [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Updated VIF entry in instance network info cache for port ce687f2c-b090-4884-959e-0d0e5154ace0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.102 183134 DEBUG nova.network.neutron [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Updating instance_info_cache with network_info: [{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.122 183134 DEBUG oslo_concurrency.lockutils [req-420fd230-1fb6-48fa-9412-837a599ca120 req-1b44336e-a946-4e37-8b93-91451347141c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.297 183134 DEBUG nova.compute.manager [req-18571263-f940-4435-a65e-4b6e5d6f2664 req-4a24734d-2abb-4b69-bfe6-02e1d6b00f65 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.297 183134 DEBUG oslo_concurrency.lockutils [req-18571263-f940-4435-a65e-4b6e5d6f2664 req-4a24734d-2abb-4b69-bfe6-02e1d6b00f65 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.297 183134 DEBUG oslo_concurrency.lockutils [req-18571263-f940-4435-a65e-4b6e5d6f2664 req-4a24734d-2abb-4b69-bfe6-02e1d6b00f65 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.298 183134 DEBUG oslo_concurrency.lockutils [req-18571263-f940-4435-a65e-4b6e5d6f2664 req-4a24734d-2abb-4b69-bfe6-02e1d6b00f65 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.298 183134 DEBUG nova.compute.manager [req-18571263-f940-4435-a65e-4b6e5d6f2664 req-4a24734d-2abb-4b69-bfe6-02e1d6b00f65 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Processing event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.298 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.301 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765795.301489, e8bcd6f1-636f-4cb1-8133-fa91df48fe59 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.302 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.303 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.312 183134 INFO nova.virt.libvirt.driver [-] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance spawned successfully.#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.312 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.317 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.320 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.341 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.342 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.342 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.342 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.343 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.343 183134 DEBUG nova.virt.libvirt.driver [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.347 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.425 183134 INFO nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Took 7.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.426 183134 DEBUG nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.558 183134 INFO nova.compute.manager [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Took 8.06 seconds to build instance.#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.627 183134 DEBUG oslo_concurrency.lockutils [None req-9eb51fe5-bde8-45ae-9869-95ac596b6ddc 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.914 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:35 np0005601977 nova_compute[183130]: 2026-01-30 09:36:35.916 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:37 np0005601977 nova_compute[183130]: 2026-01-30 09:36:37.384 183134 DEBUG nova.compute.manager [req-a0e6ba04-6902-4d5d-9f28-85157f07ee7d req-6004326a-d8fe-4c17-b80e-e2f933da07b3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:37 np0005601977 nova_compute[183130]: 2026-01-30 09:36:37.384 183134 DEBUG oslo_concurrency.lockutils [req-a0e6ba04-6902-4d5d-9f28-85157f07ee7d req-6004326a-d8fe-4c17-b80e-e2f933da07b3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:37 np0005601977 nova_compute[183130]: 2026-01-30 09:36:37.384 183134 DEBUG oslo_concurrency.lockutils [req-a0e6ba04-6902-4d5d-9f28-85157f07ee7d req-6004326a-d8fe-4c17-b80e-e2f933da07b3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:37 np0005601977 nova_compute[183130]: 2026-01-30 09:36:37.385 183134 DEBUG oslo_concurrency.lockutils [req-a0e6ba04-6902-4d5d-9f28-85157f07ee7d req-6004326a-d8fe-4c17-b80e-e2f933da07b3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:37 np0005601977 nova_compute[183130]: 2026-01-30 09:36:37.385 183134 DEBUG nova.compute.manager [req-a0e6ba04-6902-4d5d-9f28-85157f07ee7d req-6004326a-d8fe-4c17-b80e-e2f933da07b3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] No waiting events found dispatching network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:36:37 np0005601977 nova_compute[183130]: 2026-01-30 09:36:37.386 183134 WARNING nova.compute.manager [req-a0e6ba04-6902-4d5d-9f28-85157f07ee7d req-6004326a-d8fe-4c17-b80e-e2f933da07b3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received unexpected event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:36:40 np0005601977 nova_compute[183130]: 2026-01-30 09:36:40.398 183134 DEBUG nova.compute.manager [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-changed-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:40 np0005601977 nova_compute[183130]: 2026-01-30 09:36:40.399 183134 DEBUG nova.compute.manager [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Refreshing instance network info cache due to event network-changed-ce687f2c-b090-4884-959e-0d0e5154ace0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:36:40 np0005601977 nova_compute[183130]: 2026-01-30 09:36:40.399 183134 DEBUG oslo_concurrency.lockutils [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:40 np0005601977 nova_compute[183130]: 2026-01-30 09:36:40.400 183134 DEBUG oslo_concurrency.lockutils [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:40 np0005601977 nova_compute[183130]: 2026-01-30 09:36:40.400 183134 DEBUG nova.network.neutron [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Refreshing network info cache for port ce687f2c-b090-4884-959e-0d0e5154ace0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:36:40 np0005601977 nova_compute[183130]: 2026-01-30 09:36:40.917 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:43 np0005601977 nova_compute[183130]: 2026-01-30 09:36:43.294 183134 DEBUG nova.network.neutron [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Updated VIF entry in instance network info cache for port ce687f2c-b090-4884-959e-0d0e5154ace0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:36:43 np0005601977 nova_compute[183130]: 2026-01-30 09:36:43.296 183134 DEBUG nova.network.neutron [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Updating instance_info_cache with network_info: [{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:43 np0005601977 nova_compute[183130]: 2026-01-30 09:36:43.331 183134 DEBUG oslo_concurrency.lockutils [req-0e57083c-b816-4002-a33b-d5c5df3eaf28 req-752a0550-a0eb-408e-a8c4-a385101bf86f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:44 np0005601977 podman[222527]: 2026-01-30 09:36:44.824018155 +0000 UTC m=+0.045998672 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:36:45 np0005601977 nova_compute[183130]: 2026-01-30 09:36:45.919 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:46Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4d:a0:0d 10.100.0.14
Jan 30 04:36:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:46Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4d:a0:0d 10.100.0.14
Jan 30 04:36:50 np0005601977 nova_compute[183130]: 2026-01-30 09:36:50.922 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:52 np0005601977 nova_compute[183130]: 2026-01-30 09:36:52.867 183134 INFO nova.compute.manager [None req-fec94948-e94b-41d9-b693-6f3375338623 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Get console output#033[00m
Jan 30 04:36:52 np0005601977 nova_compute[183130]: 2026-01-30 09:36:52.872 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:36:53 np0005601977 nova_compute[183130]: 2026-01-30 09:36:53.151 183134 DEBUG oslo_concurrency.lockutils [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:53 np0005601977 nova_compute[183130]: 2026-01-30 09:36:53.152 183134 DEBUG oslo_concurrency.lockutils [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:53 np0005601977 nova_compute[183130]: 2026-01-30 09:36:53.152 183134 DEBUG nova.compute.manager [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:53 np0005601977 nova_compute[183130]: 2026-01-30 09:36:53.155 183134 DEBUG nova.compute.manager [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 30 04:36:53 np0005601977 nova_compute[183130]: 2026-01-30 09:36:53.156 183134 DEBUG nova.objects.instance [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'flavor' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:53 np0005601977 nova_compute[183130]: 2026-01-30 09:36:53.180 183134 DEBUG nova.virt.libvirt.driver [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 30 04:36:53 np0005601977 podman[222564]: 2026-01-30 09:36:53.822426641 +0000 UTC m=+0.040958015 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, name=ubi9/ubi-minimal, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:36:53 np0005601977 podman[222565]: 2026-01-30 09:36:53.834879122 +0000 UTC m=+0.050597295 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:36:55 np0005601977 kernel: tapce687f2c-b0 (unregistering): left promiscuous mode
Jan 30 04:36:55 np0005601977 NetworkManager[55565]: <info>  [1769765815.3226] device (tapce687f2c-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:36:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:55Z|00382|binding|INFO|Releasing lport ce687f2c-b090-4884-959e-0d0e5154ace0 from this chassis (sb_readonly=0)
Jan 30 04:36:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:55Z|00383|binding|INFO|Setting lport ce687f2c-b090-4884-959e-0d0e5154ace0 down in Southbound
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.332 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:36:55Z|00384|binding|INFO|Removing iface tapce687f2c-b0 ovn-installed in OVS
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.335 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.340 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.341 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.341 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.342 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.344 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:a0:0d 10.100.0.14'], port_security=['fa:16:3e:4d:a0:0d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8bcd6f1-636f-4cb1-8133-fa91df48fe59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a324d25-aadb-48ba-b761-6712d942455e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a67ba197-5b77-4e39-9974-dcaa8c946237', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=012b2bd2-c801-40d7-8aac-bd8116324a2b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=ce687f2c-b090-4884-959e-0d0e5154ace0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.345 104706 INFO neutron.agent.ovn.metadata.agent [-] Port ce687f2c-b090-4884-959e-0d0e5154ace0 in datapath 0a324d25-aadb-48ba-b761-6712d942455e unbound from our chassis#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.347 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a324d25-aadb-48ba-b761-6712d942455e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.348 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[215994fd-a5d5-4615-babf-12402086b3f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.348 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e namespace which is not needed anymore#033[00m
Jan 30 04:36:55 np0005601977 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 30 04:36:55 np0005601977 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000025.scope: Consumed 12.032s CPU time.
Jan 30 04:36:55 np0005601977 systemd-machined[154431]: Machine qemu-31-instance-00000025 terminated.
Jan 30 04:36:55 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [NOTICE]   (222503) : haproxy version is 2.8.14-c23fe91
Jan 30 04:36:55 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [NOTICE]   (222503) : path to executable is /usr/sbin/haproxy
Jan 30 04:36:55 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [WARNING]  (222503) : Exiting Master process...
Jan 30 04:36:55 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [ALERT]    (222503) : Current worker (222512) exited with code 143 (Terminated)
Jan 30 04:36:55 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222489]: [WARNING]  (222503) : All workers exited. Exiting... (0)
Jan 30 04:36:55 np0005601977 systemd[1]: libpod-70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6.scope: Deactivated successfully.
Jan 30 04:36:55 np0005601977 podman[222628]: 2026-01-30 09:36:55.495968875 +0000 UTC m=+0.045306981 container died 70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:36:55 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6-userdata-shm.mount: Deactivated successfully.
Jan 30 04:36:55 np0005601977 systemd[1]: var-lib-containers-storage-overlay-9e412282b4229448a96b15d69ff4889b50627dc216124399eea0c32cff672e2f-merged.mount: Deactivated successfully.
Jan 30 04:36:55 np0005601977 podman[222628]: 2026-01-30 09:36:55.530196316 +0000 UTC m=+0.079534432 container cleanup 70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:36:55 np0005601977 systemd[1]: libpod-conmon-70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6.scope: Deactivated successfully.
Jan 30 04:36:55 np0005601977 kernel: tapce687f2c-b0: entered promiscuous mode
Jan 30 04:36:55 np0005601977 kernel: tapce687f2c-b0 (unregistering): left promiscuous mode
Jan 30 04:36:55 np0005601977 NetworkManager[55565]: <info>  [1769765815.5582] manager: (tapce687f2c-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.566 183134 DEBUG nova.compute.manager [req-0438de27-7069-45f2-a657-0d5109a7cbaf req-9faacc4d-7614-4860-bcac-59721efd2dcc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-vif-unplugged-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.568 183134 DEBUG oslo_concurrency.lockutils [req-0438de27-7069-45f2-a657-0d5109a7cbaf req-9faacc4d-7614-4860-bcac-59721efd2dcc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.568 183134 DEBUG oslo_concurrency.lockutils [req-0438de27-7069-45f2-a657-0d5109a7cbaf req-9faacc4d-7614-4860-bcac-59721efd2dcc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.568 183134 DEBUG oslo_concurrency.lockutils [req-0438de27-7069-45f2-a657-0d5109a7cbaf req-9faacc4d-7614-4860-bcac-59721efd2dcc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.568 183134 DEBUG nova.compute.manager [req-0438de27-7069-45f2-a657-0d5109a7cbaf req-9faacc4d-7614-4860-bcac-59721efd2dcc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] No waiting events found dispatching network-vif-unplugged-ce687f2c-b090-4884-959e-0d0e5154ace0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.568 183134 WARNING nova.compute.manager [req-0438de27-7069-45f2-a657-0d5109a7cbaf req-9faacc4d-7614-4860-bcac-59721efd2dcc dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received unexpected event network-vif-unplugged-ce687f2c-b090-4884-959e-0d0e5154ace0 for instance with vm_state active and task_state powering-off.#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.598 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 podman[222655]: 2026-01-30 09:36:55.630045194 +0000 UTC m=+0.081727765 container remove 70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.634 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1cacdb9e-17e5-4d9d-8a80-0d5e11d66a91]: (4, ('Fri Jan 30 09:36:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e (70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6)\n70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6\nFri Jan 30 09:36:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e (70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6)\n70149437a972099363c8ca6461c1832352c9334e731702391f81b2f9c77ce8b6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.636 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e0ee9b41-6b84-4443-ab3b-c4ac9aa18ff6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.637 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a324d25-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:55 np0005601977 kernel: tap0a324d25-a0: left promiscuous mode
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.638 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.645 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.650 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c163c0f6-6846-4fd7-9239-3b312d151af9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.668 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b61ed1e8-e1bf-42a2-a0f5-4f469e142f15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.670 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[581302e4-18e5-42f0-b673-b64a1dae1a5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.684 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8f507a18-f11f-43ce-9495-41f708b61de8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432585, 'reachable_time': 35956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222688, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.688 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:36:55 np0005601977 systemd[1]: run-netns-ovnmeta\x2d0a324d25\x2daadb\x2d48ba\x2db761\x2d6712d942455e.mount: Deactivated successfully.
Jan 30 04:36:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:55.689 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[7e418e72-7b32-49f6-b80f-a537b3b81f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:55 np0005601977 nova_compute[183130]: 2026-01-30 09:36:55.925 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:56 np0005601977 nova_compute[183130]: 2026-01-30 09:36:56.201 183134 INFO nova.virt.libvirt.driver [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance shutdown successfully after 3 seconds.#033[00m
Jan 30 04:36:56 np0005601977 nova_compute[183130]: 2026-01-30 09:36:56.207 183134 INFO nova.virt.libvirt.driver [-] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance destroyed successfully.#033[00m
Jan 30 04:36:56 np0005601977 nova_compute[183130]: 2026-01-30 09:36:56.207 183134 DEBUG nova.objects.instance [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'numa_topology' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:56 np0005601977 nova_compute[183130]: 2026-01-30 09:36:56.220 183134 DEBUG nova.compute.manager [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:36:56 np0005601977 nova_compute[183130]: 2026-01-30 09:36:56.258 183134 DEBUG oslo_concurrency.lockutils [None req-6a892504-3456-4455-95e7-da79ca379b46 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:57.388 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:57.389 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:57.389 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.694 183134 DEBUG nova.compute.manager [req-fc42864c-7189-4069-89f4-c6ce3dc48e77 req-b3d5dd61-667d-4972-b7d9-5c24cd08f371 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.694 183134 DEBUG oslo_concurrency.lockutils [req-fc42864c-7189-4069-89f4-c6ce3dc48e77 req-b3d5dd61-667d-4972-b7d9-5c24cd08f371 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.694 183134 DEBUG oslo_concurrency.lockutils [req-fc42864c-7189-4069-89f4-c6ce3dc48e77 req-b3d5dd61-667d-4972-b7d9-5c24cd08f371 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.695 183134 DEBUG oslo_concurrency.lockutils [req-fc42864c-7189-4069-89f4-c6ce3dc48e77 req-b3d5dd61-667d-4972-b7d9-5c24cd08f371 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.695 183134 DEBUG nova.compute.manager [req-fc42864c-7189-4069-89f4-c6ce3dc48e77 req-b3d5dd61-667d-4972-b7d9-5c24cd08f371 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] No waiting events found dispatching network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.695 183134 WARNING nova.compute.manager [req-fc42864c-7189-4069-89f4-c6ce3dc48e77 req-b3d5dd61-667d-4972-b7d9-5c24cd08f371 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received unexpected event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 for instance with vm_state stopped and task_state None.#033[00m
Jan 30 04:36:57 np0005601977 nova_compute[183130]: 2026-01-30 09:36:57.946 183134 INFO nova.compute.manager [None req-bbfafd3e-15bf-40f8-983c-e707e8c1db36 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Get console output#033[00m
Jan 30 04:36:58 np0005601977 nova_compute[183130]: 2026-01-30 09:36:58.234 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'flavor' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:58 np0005601977 nova_compute[183130]: 2026-01-30 09:36:58.448 183134 DEBUG oslo_concurrency.lockutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:36:58 np0005601977 nova_compute[183130]: 2026-01-30 09:36:58.449 183134 DEBUG oslo_concurrency.lockutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:36:58 np0005601977 nova_compute[183130]: 2026-01-30 09:36:58.449 183134 DEBUG nova.network.neutron [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:36:58 np0005601977 nova_compute[183130]: 2026-01-30 09:36:58.450 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'info_cache' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:36:59.345 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.681 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.681 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.703 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.822 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.823 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.832 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.833 183134 INFO nova.compute.claims [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.840 183134 DEBUG nova.network.neutron [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Updating instance_info_cache with network_info: [{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.899 183134 DEBUG oslo_concurrency.lockutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-e8bcd6f1-636f-4cb1-8133-fa91df48fe59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.936 183134 INFO nova.virt.libvirt.driver [-] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance destroyed successfully.#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.937 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'numa_topology' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.953 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.967 183134 DEBUG nova.virt.libvirt.vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:36:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1216546466',display_name='tempest-TestNetworkAdvancedServerOps-server-1216546466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1216546466',id=37,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF4xzDdvAT3TB6+pS58kA/Q6TjKvRHtp/rfQ8kRk+luyL76Y6GS0NpLkZhqSFAsNbiAU1WKUenXI/Y9pKszXstyjqDm111TlZMUsrXUlqK8gspewyPLq0RjqkSmEPiU09g==',key_name='tempest-TestNetworkAdvancedServerOps-198083341',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8jcs92kv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:36:56Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=e8bcd6f1-636f-4cb1-8133-fa91df48fe59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.968 183134 DEBUG nova.network.os_vif_util [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.968 183134 DEBUG nova.network.os_vif_util [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.969 183134 DEBUG os_vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.970 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.971 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce687f2c-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.972 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.974 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.976 183134 INFO os_vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0')#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.985 183134 DEBUG nova.virt.libvirt.driver [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Start _get_guest_xml network_info=[{"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.988 183134 WARNING nova.virt.libvirt.driver [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.995 183134 DEBUG nova.virt.libvirt.host [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:36:59 np0005601977 nova_compute[183130]: 2026-01-30 09:36:59.995 183134 DEBUG nova.virt.libvirt.host [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.004 183134 DEBUG nova.virt.libvirt.host [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.005 183134 DEBUG nova.virt.libvirt.host [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.006 183134 DEBUG nova.virt.libvirt.driver [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.006 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.006 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.007 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.007 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.007 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.007 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.007 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.008 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.008 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.008 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.008 183134 DEBUG nova.virt.hardware [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.009 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.031 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.051 183134 DEBUG nova.compute.provider_tree [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.074 183134 DEBUG nova.scheduler.client.report [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.088 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.089 183134 DEBUG oslo_concurrency.lockutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.089 183134 DEBUG oslo_concurrency.lockutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.089 183134 DEBUG oslo_concurrency.lockutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.090 183134 DEBUG nova.virt.libvirt.vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:36:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1216546466',display_name='tempest-TestNetworkAdvancedServerOps-server-1216546466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1216546466',id=37,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF4xzDdvAT3TB6+pS58kA/Q6TjKvRHtp/rfQ8kRk+luyL76Y6GS0NpLkZhqSFAsNbiAU1WKUenXI/Y9pKszXstyjqDm111TlZMUsrXUlqK8gspewyPLq0RjqkSmEPiU09g==',key_name='tempest-TestNetworkAdvancedServerOps-198083341',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8jcs92kv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:36:56Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=e8bcd6f1-636f-4cb1-8133-fa91df48fe59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.090 183134 DEBUG nova.network.os_vif_util [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.091 183134 DEBUG nova.network.os_vif_util [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.092 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.104 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.104 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.116 183134 DEBUG nova.virt.libvirt.driver [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <uuid>e8bcd6f1-636f-4cb1-8133-fa91df48fe59</uuid>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <name>instance-00000025</name>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1216546466</nova:name>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:36:59</nova:creationTime>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:user uuid="67d560d0067b4b56aa346073fcc16d6d">tempest-TestNetworkAdvancedServerOps-856785562-project-member</nova:user>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:project uuid="3f3fcd6f23d74ceca8c3efd31a373f0b">tempest-TestNetworkAdvancedServerOps-856785562</nova:project>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        <nova:port uuid="ce687f2c-b090-4884-959e-0d0e5154ace0">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <entry name="serial">e8bcd6f1-636f-4cb1-8133-fa91df48fe59</entry>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <entry name="uuid">e8bcd6f1-636f-4cb1-8133-fa91df48fe59</entry>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk.config"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:4d:a0:0d"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <target dev="tapce687f2c-b0"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/console.log" append="off"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <input type="keyboard" bus="usb"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:37:00 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:37:00 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:37:00 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:37:00 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.117 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.156 183134 INFO nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.159 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.159 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.169 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.169 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.181 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.214 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.215 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'trusted_certs' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.240 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.307 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.309 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.309 183134 INFO nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Creating image(s)#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.310 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.310 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.310 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.325 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.325 183134 DEBUG nova.virt.disk.api [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Checking if we can resize image /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.325 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.340 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.376 183134 DEBUG oslo_concurrency.processutils [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.377 183134 DEBUG nova.virt.disk.api [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Cannot resize image /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.377 183134 DEBUG nova.objects.instance [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'migration_context' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.383 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.384 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.385 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.394 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.440 183134 DEBUG nova.virt.libvirt.vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:36:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1216546466',display_name='tempest-TestNetworkAdvancedServerOps-server-1216546466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1216546466',id=37,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF4xzDdvAT3TB6+pS58kA/Q6TjKvRHtp/rfQ8kRk+luyL76Y6GS0NpLkZhqSFAsNbiAU1WKUenXI/Y9pKszXstyjqDm111TlZMUsrXUlqK8gspewyPLq0RjqkSmEPiU09g==',key_name='tempest-TestNetworkAdvancedServerOps-198083341',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8jcs92kv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:36:56Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=e8bcd6f1-636f-4cb1-8133-fa91df48fe59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.440 183134 DEBUG nova.network.os_vif_util [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.441 183134 DEBUG nova.network.os_vif_util [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.442 183134 DEBUG os_vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.443 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.444 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.444 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.448 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.449 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce687f2c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.449 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce687f2c-b0, col_values=(('external_ids', {'iface-id': 'ce687f2c-b090-4884-959e-0d0e5154ace0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:a0:0d', 'vm-uuid': 'e8bcd6f1-636f-4cb1-8133-fa91df48fe59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.451 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.451 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.480 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.481 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.483 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.4929] manager: (tapce687f2c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.499 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.502 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.503 183134 INFO os_vif [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0')#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.527 183134 DEBUG nova.policy [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.532 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.533 183134 DEBUG nova.virt.disk.api [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.533 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:00 np0005601977 kernel: tapce687f2c-b0: entered promiscuous mode
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.5684] manager: (tapce687f2c-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.568 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:00Z|00385|binding|INFO|Claiming lport ce687f2c-b090-4884-959e-0d0e5154ace0 for this chassis.
Jan 30 04:37:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:00Z|00386|binding|INFO|ce687f2c-b090-4884-959e-0d0e5154ace0: Claiming fa:16:3e:4d:a0:0d 10.100.0.14
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.570 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:00Z|00387|binding|INFO|Setting lport ce687f2c-b090-4884-959e-0d0e5154ace0 ovn-installed in OVS
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.574 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.578 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:00Z|00388|binding|INFO|Setting lport ce687f2c-b090-4884-959e-0d0e5154ace0 up in Southbound
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.579 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:a0:0d 10.100.0.14'], port_security=['fa:16:3e:4d:a0:0d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8bcd6f1-636f-4cb1-8133-fa91df48fe59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a324d25-aadb-48ba-b761-6712d942455e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a67ba197-5b77-4e39-9974-dcaa8c946237', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=012b2bd2-c801-40d7-8aac-bd8116324a2b, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=ce687f2c-b090-4884-959e-0d0e5154ace0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.580 104706 INFO neutron.agent.ovn.metadata.agent [-] Port ce687f2c-b090-4884-959e-0d0e5154ace0 in datapath 0a324d25-aadb-48ba-b761-6712d942455e bound to our chassis#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.582 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a324d25-aadb-48ba-b761-6712d942455e#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.591 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.592 183134 DEBUG nova.virt.disk.api [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.592 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf33c84-c675-4cb5-a5a8-ac11dcba5768]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.593 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.593 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a324d25-a1 in ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.593 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Ensure instance console log exists: /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.594 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.594 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.594 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.595 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a324d25-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.595 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0789ef9e-eb0c-4d27-bada-4d674c3fcc41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 systemd-udevd[222736]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.596 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e0183d8f-1b69-4af4-b3fb-acc82c93c15a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 systemd-machined[154431]: New machine qemu-32-instance-00000025.
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.605 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[f23dfc52-56d1-4451-8a25-f861a88e6a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.6089] device (tapce687f2c-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.6096] device (tapce687f2c-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:37:00 np0005601977 systemd[1]: Started Virtual Machine qemu-32-instance-00000025.
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.617 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[58d85d40-72cb-4003-8ba0-844972aca22c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.637 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c5abd7ac-a7b5-4b4f-b0c9-4f07f755ebad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.640 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb4e8be-bda9-4c7c-8fde-0c4a56782bdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.6416] manager: (tap0a324d25-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.665 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5e592517-52e3-4824-88b3-01e2ae2c74b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.668 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[847e0777-1c33-4f38-9ab9-bc74c32527b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.6841] device (tap0a324d25-a0): carrier: link connected
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.688 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[6a2266fb-be65-4e6d-bb7c-53f448451970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.700 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5b95e5c2-9bb6-4d34-a1f3-fb799365ec3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a324d25-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:e2:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435202, 'reachable_time': 23169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222769, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.711 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7efcea-8717-4035-9cb1-f5a7cdac332a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:e279'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435202, 'tstamp': 435202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222770, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.725 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[29d1fbc6-f4f5-4adf-9dcb-aef3fed574f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a324d25-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:e2:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435202, 'reachable_time': 23169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222771, 'error': None, 'target': 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.747 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a546a7-3dd7-478f-be15-cb209cc9c845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.787 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9a96f1d5-5b7f-4b93-8864-34792d02a5af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.789 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a324d25-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.789 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.789 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a324d25-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:00 np0005601977 kernel: tap0a324d25-a0: entered promiscuous mode
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.791 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 NetworkManager[55565]: <info>  [1769765820.7933] manager: (tap0a324d25-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.796 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a324d25-a0, col_values=(('external_ids', {'iface-id': '4cd221df-f660-4aff-810f-059c8ba15ad5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.797 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:00Z|00389|binding|INFO|Releasing lport 4cd221df-f660-4aff-810f-059c8ba15ad5 from this chassis (sb_readonly=0)
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.800 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a324d25-aadb-48ba-b761-6712d942455e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a324d25-aadb-48ba-b761-6712d942455e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.801 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.802 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3f042d32-451c-450c-a2a8-1f7e3561a20f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.802 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-0a324d25-aadb-48ba-b761-6712d942455e
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/0a324d25-aadb-48ba-b761-6712d942455e.pid.haproxy
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 0a324d25-aadb-48ba-b761-6712d942455e
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:37:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:00.803 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e', 'env', 'PROCESS_TAG=haproxy-0a324d25-aadb-48ba-b761-6712d942455e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a324d25-aadb-48ba-b761-6712d942455e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.855 183134 DEBUG nova.compute.manager [req-13a25b3b-9936-4af6-a26d-bb8e0cf950ab req-31178ab8-a6d9-4af5-b713-5d63a8c84b74 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.855 183134 DEBUG oslo_concurrency.lockutils [req-13a25b3b-9936-4af6-a26d-bb8e0cf950ab req-31178ab8-a6d9-4af5-b713-5d63a8c84b74 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.856 183134 DEBUG oslo_concurrency.lockutils [req-13a25b3b-9936-4af6-a26d-bb8e0cf950ab req-31178ab8-a6d9-4af5-b713-5d63a8c84b74 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.856 183134 DEBUG oslo_concurrency.lockutils [req-13a25b3b-9936-4af6-a26d-bb8e0cf950ab req-31178ab8-a6d9-4af5-b713-5d63a8c84b74 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.856 183134 DEBUG nova.compute.manager [req-13a25b3b-9936-4af6-a26d-bb8e0cf950ab req-31178ab8-a6d9-4af5-b713-5d63a8c84b74 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] No waiting events found dispatching network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.857 183134 WARNING nova.compute.manager [req-13a25b3b-9936-4af6-a26d-bb8e0cf950ab req-31178ab8-a6d9-4af5-b713-5d63a8c84b74 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received unexpected event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 30 04:37:00 np0005601977 nova_compute[183130]: 2026-01-30 09:37:00.928 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:01 np0005601977 podman[222803]: 2026-01-30 09:37:01.16551104 +0000 UTC m=+0.101414035 container create 2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 30 04:37:01 np0005601977 podman[222803]: 2026-01-30 09:37:01.080727477 +0000 UTC m=+0.016630452 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:37:01 np0005601977 systemd[1]: Started libpod-conmon-2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567.scope.
Jan 30 04:37:01 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:37:01 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4417a5a110e1398ae80b91013867d77ede21c08f4e36c9c23cd58d451e5d2ded/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:37:01 np0005601977 podman[222803]: 2026-01-30 09:37:01.252525697 +0000 UTC m=+0.188428662 container init 2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:37:01 np0005601977 podman[222817]: 2026-01-30 09:37:01.253176206 +0000 UTC m=+0.061426538 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:37:01 np0005601977 podman[222803]: 2026-01-30 09:37:01.257037518 +0000 UTC m=+0.192940473 container start 2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:37:01 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [NOTICE]   (222864) : New worker (222866) forked
Jan 30 04:37:01 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [NOTICE]   (222864) : Loading success.
Jan 30 04:37:01 np0005601977 podman[222818]: 2026-01-30 09:37:01.312814921 +0000 UTC m=+0.115494162 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.394 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Removed pending event for e8bcd6f1-636f-4cb1-8133-fa91df48fe59 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.394 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765821.3934815, e8bcd6f1-636f-4cb1-8133-fa91df48fe59 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.394 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.397 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Successfully created port: 3b40bc8e-82f2-4093-9eff-d0f741f37a3f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.400 183134 DEBUG nova.compute.manager [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.405 183134 INFO nova.virt.libvirt.driver [-] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance rebooted successfully.#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.405 183134 DEBUG nova.compute.manager [None req-94915706-1291-41d6-a63c-e5398554dcff 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.448 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.451 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.490 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.490 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765821.3950856, e8bcd6f1-636f-4cb1-8133-fa91df48fe59 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.490 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] VM Started (Lifecycle Event)#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.520 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:37:01 np0005601977 nova_compute[183130]: 2026-01-30 09:37:01.523 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.391 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Successfully created port: 904b3416-66b9-4a6d-8b5c-92808f68a476 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.995 183134 DEBUG nova.compute.manager [req-d59b9bc7-fdbf-4c78-84d1-5bda0a3e5176 req-4cccfe1a-8257-4669-8524-04cd173f0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.996 183134 DEBUG oslo_concurrency.lockutils [req-d59b9bc7-fdbf-4c78-84d1-5bda0a3e5176 req-4cccfe1a-8257-4669-8524-04cd173f0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.996 183134 DEBUG oslo_concurrency.lockutils [req-d59b9bc7-fdbf-4c78-84d1-5bda0a3e5176 req-4cccfe1a-8257-4669-8524-04cd173f0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.996 183134 DEBUG oslo_concurrency.lockutils [req-d59b9bc7-fdbf-4c78-84d1-5bda0a3e5176 req-4cccfe1a-8257-4669-8524-04cd173f0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.996 183134 DEBUG nova.compute.manager [req-d59b9bc7-fdbf-4c78-84d1-5bda0a3e5176 req-4cccfe1a-8257-4669-8524-04cd173f0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] No waiting events found dispatching network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:37:02 np0005601977 nova_compute[183130]: 2026-01-30 09:37:02.997 183134 WARNING nova.compute.manager [req-d59b9bc7-fdbf-4c78-84d1-5bda0a3e5176 req-4cccfe1a-8257-4669-8524-04cd173f0597 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Received unexpected event network-vif-plugged-ce687f2c-b090-4884-959e-0d0e5154ace0 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:37:03 np0005601977 nova_compute[183130]: 2026-01-30 09:37:03.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:03 np0005601977 nova_compute[183130]: 2026-01-30 09:37:03.835 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Successfully updated port: 3b40bc8e-82f2-4093-9eff-d0f741f37a3f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.011 183134 DEBUG nova.compute.manager [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-changed-3b40bc8e-82f2-4093-9eff-d0f741f37a3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.012 183134 DEBUG nova.compute.manager [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Refreshing instance network info cache due to event network-changed-3b40bc8e-82f2-4093-9eff-d0f741f37a3f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.012 183134 DEBUG oslo_concurrency.lockutils [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.012 183134 DEBUG oslo_concurrency.lockutils [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.012 183134 DEBUG nova.network.neutron [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Refreshing network info cache for port 3b40bc8e-82f2-4093-9eff-d0f741f37a3f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.236 183134 DEBUG nova.network.neutron [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.251 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Successfully updated port: 904b3416-66b9-4a6d-8b5c-92808f68a476 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.330 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.383 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.433 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.434 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.434 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.434 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.535 183134 DEBUG nova.network.neutron [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.550 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.600 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.620 183134 DEBUG oslo_concurrency.lockutils [req-d7d8cb23-b0c4-4b53-9fa1-657f6ca826fb req-10b65f85-d40a-4ffc-86c2-46b0b2b985c2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.621 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.622 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:37:05 np0005601977 podman[222887]: 2026-01-30 09:37:05.622915089 +0000 UTC m=+0.143430520 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.675 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.676 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.729 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bae9749f-c9d5-45d2-978f-c3f5a0451b9d/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.736 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.788 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.789 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.838 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8bcd6f1-636f-4cb1-8133-fa91df48fe59/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.891 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:37:05 np0005601977 nova_compute[183130]: 2026-01-30 09:37:05.929 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.019 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.021 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5339MB free_disk=73.19178771972656GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.021 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.021 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.250 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance bae9749f-c9d5-45d2-978f-c3f5a0451b9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.251 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance e8bcd6f1-636f-4cb1-8133-fa91df48fe59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.251 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 369ca743-1956-43ca-9978-4385a5862de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.252 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.253 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.317 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.340 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.377 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:37:06 np0005601977 nova_compute[183130]: 2026-01-30 09:37:06.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:07 np0005601977 nova_compute[183130]: 2026-01-30 09:37:07.125 183134 DEBUG nova.compute.manager [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-changed-904b3416-66b9-4a6d-8b5c-92808f68a476 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:07 np0005601977 nova_compute[183130]: 2026-01-30 09:37:07.126 183134 DEBUG nova.compute.manager [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Refreshing instance network info cache due to event network-changed-904b3416-66b9-4a6d-8b5c-92808f68a476. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:37:07 np0005601977 nova_compute[183130]: 2026-01-30 09:37:07.126 183134 DEBUG oslo_concurrency.lockutils [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:37:08 np0005601977 nova_compute[183130]: 2026-01-30 09:37:08.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:08 np0005601977 nova_compute[183130]: 2026-01-30 09:37:08.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.034 183134 DEBUG nova.network.neutron [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Updating instance_info_cache with network_info: [{"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.068 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.068 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Instance network_info: |[{"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.069 183134 DEBUG oslo_concurrency.lockutils [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.069 183134 DEBUG nova.network.neutron [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Refreshing network info cache for port 904b3416-66b9-4a6d-8b5c-92808f68a476 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.072 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Start _get_guest_xml network_info=[{"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.076 183134 WARNING nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.080 183134 DEBUG nova.virt.libvirt.host [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.080 183134 DEBUG nova.virt.libvirt.host [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.088 183134 DEBUG nova.virt.libvirt.host [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.089 183134 DEBUG nova.virt.libvirt.host [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.090 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.090 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.090 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.090 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.091 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.091 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.091 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.091 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.092 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.092 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.092 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.093 183134 DEBUG nova.virt.hardware [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.096 183134 DEBUG nova.virt.libvirt.vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:36:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-564993589',display_name='tempest-TestGettingAddress-server-564993589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-564993589',id=38,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-0un81mx9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:37:00Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=369ca743-1956-43ca-9978-4385a5862de5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.096 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.097 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:50:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b40bc8e-82f2-4093-9eff-d0f741f37a3f,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b40bc8e-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.098 183134 DEBUG nova.virt.libvirt.vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:36:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-564993589',display_name='tempest-TestGettingAddress-server-564993589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-564993589',id=38,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-0un81mx9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:37:00Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=369ca743-1956-43ca-9978-4385a5862de5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.098 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.099 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5e:2a,bridge_name='br-int',has_traffic_filtering=True,id=904b3416-66b9-4a6d-8b5c-92808f68a476,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904b3416-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.101 183134 DEBUG nova.objects.instance [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 369ca743-1956-43ca-9978-4385a5862de5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.138 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <uuid>369ca743-1956-43ca-9978-4385a5862de5</uuid>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <name>instance-00000026</name>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-564993589</nova:name>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:37:10</nova:creationTime>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:port uuid="3b40bc8e-82f2-4093-9eff-d0f741f37a3f">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        <nova:port uuid="904b3416-66b9-4a6d-8b5c-92808f68a476">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe2a:5e2a" ipVersion="6"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe2a:5e2a" ipVersion="6"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <entry name="serial">369ca743-1956-43ca-9978-4385a5862de5</entry>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <entry name="uuid">369ca743-1956-43ca-9978-4385a5862de5</entry>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.config"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:8c:50:9c"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <target dev="tap3b40bc8e-82"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:2a:5e:2a"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <target dev="tap904b3416-66"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/console.log" append="off"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:37:10 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:37:10 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:37:10 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:37:10 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.143 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Preparing to wait for external event network-vif-plugged-3b40bc8e-82f2-4093-9eff-d0f741f37a3f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.144 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.144 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.145 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.145 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Preparing to wait for external event network-vif-plugged-904b3416-66b9-4a6d-8b5c-92808f68a476 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.145 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.146 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.146 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.146 183134 DEBUG nova.virt.libvirt.vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:36:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-564993589',display_name='tempest-TestGettingAddress-server-564993589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-564993589',id=38,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-0un81mx9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:37:00Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=369ca743-1956-43ca-9978-4385a5862de5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.147 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.148 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:50:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b40bc8e-82f2-4093-9eff-d0f741f37a3f,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b40bc8e-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.148 183134 DEBUG os_vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:50:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b40bc8e-82f2-4093-9eff-d0f741f37a3f,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b40bc8e-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.149 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.149 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.149 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.152 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.153 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b40bc8e-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.153 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b40bc8e-82, col_values=(('external_ids', {'iface-id': '3b40bc8e-82f2-4093-9eff-d0f741f37a3f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:50:9c', 'vm-uuid': '369ca743-1956-43ca-9978-4385a5862de5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.155 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 NetworkManager[55565]: <info>  [1769765830.1568] manager: (tap3b40bc8e-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.158 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.163 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.164 183134 INFO os_vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:50:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b40bc8e-82f2-4093-9eff-d0f741f37a3f,network=Network(b2edea48-b03a-4c39-b516-89355e7acf87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b40bc8e-82')#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.165 183134 DEBUG nova.virt.libvirt.vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-30T09:36:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-564993589',display_name='tempest-TestGettingAddress-server-564993589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-564993589',id=38,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSDZqdxIGlZNEJRlDGnQhKCuzoV/zw59hDiHtkl4pC4mXVWVvAMVle65jDzP87oMWbcr++CJZVp4CnDD1BzgDHehqCAA1sL6BbMmHdUhCnuatgek9QO/G3Yu0BK7tkg3g==',key_name='tempest-TestGettingAddress-2033059735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-0un81mx9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:37:00Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=369ca743-1956-43ca-9978-4385a5862de5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.166 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.167 183134 DEBUG nova.network.os_vif_util [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5e:2a,bridge_name='br-int',has_traffic_filtering=True,id=904b3416-66b9-4a6d-8b5c-92808f68a476,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904b3416-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.168 183134 DEBUG os_vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5e:2a,bridge_name='br-int',has_traffic_filtering=True,id=904b3416-66b9-4a6d-8b5c-92808f68a476,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904b3416-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.168 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.169 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.169 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.171 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.171 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap904b3416-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.171 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap904b3416-66, col_values=(('external_ids', {'iface-id': '904b3416-66b9-4a6d-8b5c-92808f68a476', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:5e:2a', 'vm-uuid': '369ca743-1956-43ca-9978-4385a5862de5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.213 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 NetworkManager[55565]: <info>  [1769765830.2153] manager: (tap904b3416-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.216 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.220 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.221 183134 INFO os_vif [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5e:2a,bridge_name='br-int',has_traffic_filtering=True,id=904b3416-66b9-4a6d-8b5c-92808f68a476,network=Network(f2b07532-97d0-4974-827c-4709f0bf52f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904b3416-66')#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.344 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.346 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.346 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:8c:50:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.347 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:2a:5e:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.348 183134 INFO nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Using config drive#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.377 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:10 np0005601977 nova_compute[183130]: 2026-01-30 09:37:10.933 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.200 183134 INFO nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Creating config drive at /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.config#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.208 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfevpea8r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.329 183134 DEBUG oslo_concurrency.processutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/369ca743-1956-43ca-9978-4385a5862de5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfevpea8r" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:37:11 np0005601977 kernel: tap3b40bc8e-82: entered promiscuous mode
Jan 30 04:37:11 np0005601977 NetworkManager[55565]: <info>  [1769765831.3769] manager: (tap3b40bc8e-82): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.379 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00390|binding|INFO|Claiming lport 3b40bc8e-82f2-4093-9eff-d0f741f37a3f for this chassis.
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00391|binding|INFO|3b40bc8e-82f2-4093-9eff-d0f741f37a3f: Claiming fa:16:3e:8c:50:9c 10.100.0.4
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.384 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00392|binding|INFO|Setting lport 3b40bc8e-82f2-4093-9eff-d0f741f37a3f ovn-installed in OVS
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.390 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00393|binding|INFO|Setting lport 3b40bc8e-82f2-4093-9eff-d0f741f37a3f up in Southbound
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.395 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:50:9c 10.100.0.4'], port_security=['fa:16:3e:8c:50:9c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '369ca743-1956-43ca-9978-4385a5862de5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2edea48-b03a-4c39-b516-89355e7acf87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd1977375-373b-46bc-9d23-918fa4e3324a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ae2ce11-04ba-4a06-91ff-cbcd0cf4d441, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=3b40bc8e-82f2-4093-9eff-d0f741f37a3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.397 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 3b40bc8e-82f2-4093-9eff-d0f741f37a3f in datapath b2edea48-b03a-4c39-b516-89355e7acf87 bound to our chassis#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.399 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2edea48-b03a-4c39-b516-89355e7acf87#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.404 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 NetworkManager[55565]: <info>  [1769765831.4075] manager: (tap904b3416-66): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Jan 30 04:37:11 np0005601977 systemd-udevd[222951]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:37:11 np0005601977 systemd-udevd[222952]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:37:11 np0005601977 kernel: tap904b3416-66: entered promiscuous mode
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.413 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00394|if_status|INFO|Dropped 1 log messages in last 49 seconds (most recently, 49 seconds ago) due to excessive rate
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00395|if_status|INFO|Not updating pb chassis for 904b3416-66b9-4a6d-8b5c-92808f68a476 now as sb is readonly
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00396|binding|INFO|Claiming lport 904b3416-66b9-4a6d-8b5c-92808f68a476 for this chassis.
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00397|binding|INFO|904b3416-66b9-4a6d-8b5c-92808f68a476: Claiming fa:16:3e:2a:5e:2a 2001:db8:0:1:f816:3eff:fe2a:5e2a 2001:db8::f816:3eff:fe2a:5e2a
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.414 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8b07f884-a5de-49d3-87e4-bb14d8c25bc1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 NetworkManager[55565]: <info>  [1769765831.4180] device (tap904b3416-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:37:11 np0005601977 NetworkManager[55565]: <info>  [1769765831.4184] device (tap904b3416-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:37:11 np0005601977 NetworkManager[55565]: <info>  [1769765831.4192] device (tap3b40bc8e-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:37:11 np0005601977 NetworkManager[55565]: <info>  [1769765831.4196] device (tap3b40bc8e-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.425 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00398|binding|INFO|Setting lport 904b3416-66b9-4a6d-8b5c-92808f68a476 ovn-installed in OVS
Jan 30 04:37:11 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:11Z|00399|binding|INFO|Setting lport 904b3416-66b9-4a6d-8b5c-92808f68a476 up in Southbound
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.427 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.427 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:5e:2a 2001:db8:0:1:f816:3eff:fe2a:5e2a 2001:db8::f816:3eff:fe2a:5e2a'], port_security=['fa:16:3e:2a:5e:2a 2001:db8:0:1:f816:3eff:fe2a:5e2a 2001:db8::f816:3eff:fe2a:5e2a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe2a:5e2a/64 2001:db8::f816:3eff:fe2a:5e2a/64', 'neutron:device_id': '369ca743-1956-43ca-9978-4385a5862de5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd1977375-373b-46bc-9d23-918fa4e3324a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e38aee4e-ba47-49c3-9bdf-bed97e27acef, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=904b3416-66b9-4a6d-8b5c-92808f68a476) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.433 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a9677f99-749e-4c2b-86ba-8dad53599814]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 systemd-machined[154431]: New machine qemu-33-instance-00000026.
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.436 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6e922c-8a60-4a79-8dc0-bd22d542af12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 systemd[1]: Started Virtual Machine qemu-33-instance-00000026.
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.451 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f6e35d-d369-42db-8a73-19adda6faca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.468 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a766298d-2525-4f47-8f8d-d1cf5c23375f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2edea48-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:be:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431386, 'reachable_time': 23505, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222963, 'error': None, 'target': 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.479 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fca71f2d-be60-4b8a-a7b7-5e36e8b36e4a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb2edea48-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431394, 'tstamp': 431394}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222964, 'error': None, 'target': 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb2edea48-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431396, 'tstamp': 431396}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222964, 'error': None, 'target': 'ovnmeta-b2edea48-b03a-4c39-b516-89355e7acf87', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.482 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2edea48-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.484 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.485 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.485 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2edea48-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.486 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.486 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2edea48-b0, col_values=(('external_ids', {'iface-id': '75d877f2-b388-4f11-9237-a14c4feee2ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.486 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.488 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 904b3416-66b9-4a6d-8b5c-92808f68a476 in datapath f2b07532-97d0-4974-827c-4709f0bf52f6 unbound from our chassis#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.489 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2b07532-97d0-4974-827c-4709f0bf52f6#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.499 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[72322380-fcab-4553-961c-fb36a8001db8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.519 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5656b769-f361-40ca-9202-52fd289cd43f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.522 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[83873449-ad4a-4167-b151-90ded6e581f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.539 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[592479d1-a3b8-4aa6-8bda-1ef79efe3c0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.552 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b090c55f-a1fd-4468-8ef8-e4f1bd600b9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2b07532-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:b4:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 22, 'tx_packets': 4, 'rx_bytes': 1936, 'tx_bytes': 312, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 22, 'tx_packets': 4, 'rx_bytes': 1936, 'tx_bytes': 312, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431455, 'reachable_time': 38384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 22, 'inoctets': 1628, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 22, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1628, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 22, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222976, 'error': None, 'target': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.564 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b2aad212-ebb9-4d25-9839-3d85aaf61b33]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf2b07532-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431465, 'tstamp': 431465}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222977, 'error': None, 'target': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.565 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2b07532-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.567 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 nova_compute[183130]: 2026-01-30 09:37:11.568 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.570 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2b07532-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.570 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.570 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2b07532-90, col_values=(('external_ids', {'iface-id': '79c5a8be-b732-4d5f-86e3-0f3d570c8b43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:11.571 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.221 183134 DEBUG nova.compute.manager [req-390c8b9b-f672-4101-abc8-e76eacd710d9 req-4399e9c8-6383-4b14-8d02-768c679c3ba4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-vif-plugged-3b40bc8e-82f2-4093-9eff-d0f741f37a3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.223 183134 DEBUG oslo_concurrency.lockutils [req-390c8b9b-f672-4101-abc8-e76eacd710d9 req-4399e9c8-6383-4b14-8d02-768c679c3ba4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.224 183134 DEBUG oslo_concurrency.lockutils [req-390c8b9b-f672-4101-abc8-e76eacd710d9 req-4399e9c8-6383-4b14-8d02-768c679c3ba4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.224 183134 DEBUG oslo_concurrency.lockutils [req-390c8b9b-f672-4101-abc8-e76eacd710d9 req-4399e9c8-6383-4b14-8d02-768c679c3ba4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.225 183134 DEBUG nova.compute.manager [req-390c8b9b-f672-4101-abc8-e76eacd710d9 req-4399e9c8-6383-4b14-8d02-768c679c3ba4 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Processing event network-vif-plugged-3b40bc8e-82f2-4093-9eff-d0f741f37a3f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.345 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.402 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765832.4018047, 369ca743-1956-43ca-9978-4385a5862de5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.403 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] VM Started (Lifecycle Event)#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.431 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.436 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765832.4021125, 369ca743-1956-43ca-9978-4385a5862de5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.437 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.462 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.466 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:37:12 np0005601977 nova_compute[183130]: 2026-01-30 09:37:12.491 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:37:13 np0005601977 nova_compute[183130]: 2026-01-30 09:37:13.045 183134 DEBUG nova.network.neutron [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Updated VIF entry in instance network info cache for port 904b3416-66b9-4a6d-8b5c-92808f68a476. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:37:13 np0005601977 nova_compute[183130]: 2026-01-30 09:37:13.046 183134 DEBUG nova.network.neutron [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Updating instance_info_cache with network_info: [{"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:37:13 np0005601977 nova_compute[183130]: 2026-01-30 09:37:13.086 183134 DEBUG oslo_concurrency.lockutils [req-244879a5-3a47-4b09-ae33-809632dfedec req-81f9470b-ae90-4b1f-99ca-c80d85d90aa3 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:37:13 np0005601977 nova_compute[183130]: 2026-01-30 09:37:13.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:13 np0005601977 nova_compute[183130]: 2026-01-30 09:37:13.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:14 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:14Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4d:a0:0d 10.100.0.14
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.307 183134 DEBUG nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-vif-plugged-3b40bc8e-82f2-4093-9eff-d0f741f37a3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.307 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.308 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.309 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.310 183134 DEBUG nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] No event matching network-vif-plugged-3b40bc8e-82f2-4093-9eff-d0f741f37a3f in dict_keys([('network-vif-plugged', '904b3416-66b9-4a6d-8b5c-92808f68a476')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.310 183134 WARNING nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received unexpected event network-vif-plugged-3b40bc8e-82f2-4093-9eff-d0f741f37a3f for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.311 183134 DEBUG nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-vif-plugged-904b3416-66b9-4a6d-8b5c-92808f68a476 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.312 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.312 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.313 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.314 183134 DEBUG nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Processing event network-vif-plugged-904b3416-66b9-4a6d-8b5c-92808f68a476 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.314 183134 DEBUG nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-vif-plugged-904b3416-66b9-4a6d-8b5c-92808f68a476 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.315 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "369ca743-1956-43ca-9978-4385a5862de5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.315 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.316 183134 DEBUG oslo_concurrency.lockutils [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.316 183134 DEBUG nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] No waiting events found dispatching network-vif-plugged-904b3416-66b9-4a6d-8b5c-92808f68a476 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.317 183134 WARNING nova.compute.manager [req-f9cf749e-6831-48b2-b77c-aaee8173db67 req-ca948a8b-d282-46a6-b409-abdb7765e043 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received unexpected event network-vif-plugged-904b3416-66b9-4a6d-8b5c-92808f68a476 for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.318 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.323 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765834.322766, 369ca743-1956-43ca-9978-4385a5862de5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.323 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.327 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.331 183134 INFO nova.virt.libvirt.driver [-] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Instance spawned successfully.#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.332 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.340 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.349 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.356 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.361 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.362 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.362 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.363 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.363 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.364 183134 DEBUG nova.virt.libvirt.driver [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.394 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.394 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 369ca743-1956-43ca-9978-4385a5862de5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.395 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.395 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.396 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.396 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.448 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.492 183134 INFO nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Took 14.18 seconds to spawn the instance on the hypervisor.
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.492 183134 DEBUG nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.560 183134 INFO nova.compute.manager [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Took 14.78 seconds to build instance.
Jan 30 04:37:14 np0005601977 nova_compute[183130]: 2026-01-30 09:37:14.583 183134 DEBUG oslo_concurrency.lockutils [None req-a174ded9-bda9-4205-9f1b-890ce2739e2b 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "369ca743-1956-43ca-9978-4385a5862de5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.213 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.397 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.397 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.398 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.618 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.618 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.619 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.619 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bae9749f-c9d5-45d2-978f-c3f5a0451b9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:37:15 np0005601977 podman[222996]: 2026-01-30 09:37:15.848044512 +0000 UTC m=+0.062884710 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:37:15 np0005601977 nova_compute[183130]: 2026-01-30 09:37:15.936 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.710 183134 DEBUG nova.compute.manager [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Received event network-changed-3b40bc8e-82f2-4093-9eff-d0f741f37a3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.711 183134 DEBUG nova.compute.manager [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Refreshing instance network info cache due to event network-changed-3b40bc8e-82f2-4093-9eff-d0f741f37a3f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.711 183134 DEBUG oslo_concurrency.lockutils [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.711 183134 DEBUG oslo_concurrency.lockutils [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-369ca743-1956-43ca-9978-4385a5862de5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.711 183134 DEBUG nova.network.neutron [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Refreshing network info cache for port 3b40bc8e-82f2-4093-9eff-d0f741f37a3f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.802 183134 INFO nova.compute.manager [None req-a00a449a-8b31-4baf-a4c3-22d2559a345b 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Get console output
Jan 30 04:37:18 np0005601977 nova_compute[183130]: 2026-01-30 09:37:18.807 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.394 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updating instance_info_cache with network_info: [{"id": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "address": "fa:16:3e:c2:45:bb", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b60f325-bf", "ovs_interfaceid": "9b60f325-bf20-4165-a9bc-76eed7a0ebd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "address": "fa:16:3e:00:14:d8", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe00:14d8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968380f-68", "ovs_interfaceid": "8968380f-68a0-46fe-aa6d-4ad70b0ce1e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.415 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-bae9749f-c9d5-45d2-978f-c3f5a0451b9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.415 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: bae9749f-c9d5-45d2-978f-c3f5a0451b9d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.977 183134 DEBUG oslo_concurrency.lockutils [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.978 183134 DEBUG oslo_concurrency.lockutils [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.978 183134 DEBUG oslo_concurrency.lockutils [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.978 183134 DEBUG oslo_concurrency.lockutils [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.979 183134 DEBUG oslo_concurrency.lockutils [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "e8bcd6f1-636f-4cb1-8133-fa91df48fe59-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.981 183134 INFO nova.compute.manager [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Terminating instance
Jan 30 04:37:19 np0005601977 nova_compute[183130]: 2026-01-30 09:37:19.982 183134 DEBUG nova.compute.manager [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 30 04:37:20 np0005601977 kernel: tapce687f2c-b0 (unregistering): left promiscuous mode
Jan 30 04:37:20 np0005601977 NetworkManager[55565]: <info>  [1769765840.0067] device (tapce687f2c-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:37:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:20Z|00400|binding|INFO|Releasing lport ce687f2c-b090-4884-959e-0d0e5154ace0 from this chassis (sb_readonly=0)
Jan 30 04:37:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:20Z|00401|binding|INFO|Setting lport ce687f2c-b090-4884-959e-0d0e5154ace0 down in Southbound
Jan 30 04:37:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:37:20Z|00402|binding|INFO|Removing iface tapce687f2c-b0 ovn-installed in OVS
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.016 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.019 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.021 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.022 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:a0:0d 10.100.0.14'], port_security=['fa:16:3e:4d:a0:0d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8bcd6f1-636f-4cb1-8133-fa91df48fe59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a324d25-aadb-48ba-b761-6712d942455e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a67ba197-5b77-4e39-9974-dcaa8c946237', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=012b2bd2-c801-40d7-8aac-bd8116324a2b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=ce687f2c-b090-4884-959e-0d0e5154ace0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.023 104706 INFO neutron.agent.ovn.metadata.agent [-] Port ce687f2c-b090-4884-959e-0d0e5154ace0 in datapath 0a324d25-aadb-48ba-b761-6712d942455e unbound from our chassis
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.024 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a324d25-aadb-48ba-b761-6712d942455e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.025 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b0042771-3a95-456b-90c1-6bb33c2b0f82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.026 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e namespace which is not needed anymore
Jan 30 04:37:20 np0005601977 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 30 04:37:20 np0005601977 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000025.scope: Consumed 12.731s CPU time.
Jan 30 04:37:20 np0005601977 systemd-machined[154431]: Machine qemu-32-instance-00000025 terminated.
Jan 30 04:37:20 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [NOTICE]   (222864) : haproxy version is 2.8.14-c23fe91
Jan 30 04:37:20 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [NOTICE]   (222864) : path to executable is /usr/sbin/haproxy
Jan 30 04:37:20 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [WARNING]  (222864) : Exiting Master process...
Jan 30 04:37:20 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [ALERT]    (222864) : Current worker (222866) exited with code 143 (Terminated)
Jan 30 04:37:20 np0005601977 neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e[222843]: [WARNING]  (222864) : All workers exited. Exiting... (0)
Jan 30 04:37:20 np0005601977 systemd[1]: libpod-2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567.scope: Deactivated successfully.
Jan 30 04:37:20 np0005601977 podman[223041]: 2026-01-30 09:37:20.140304714 +0000 UTC m=+0.047888786 container died 2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:37:20 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567-userdata-shm.mount: Deactivated successfully.
Jan 30 04:37:20 np0005601977 systemd[1]: var-lib-containers-storage-overlay-4417a5a110e1398ae80b91013867d77ede21c08f4e36c9c23cd58d451e5d2ded-merged.mount: Deactivated successfully.
Jan 30 04:37:20 np0005601977 podman[223041]: 2026-01-30 09:37:20.174703779 +0000 UTC m=+0.082287851 container cleanup 2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:37:20 np0005601977 systemd[1]: libpod-conmon-2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567.scope: Deactivated successfully.
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.216 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:37:20 np0005601977 podman[223071]: 2026-01-30 09:37:20.227010662 +0000 UTC m=+0.039242446 container remove 2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.232 183134 INFO nova.virt.libvirt.driver [-] [instance: e8bcd6f1-636f-4cb1-8133-fa91df48fe59] Instance destroyed successfully.
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.233 183134 DEBUG nova.objects.instance [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid e8bcd6f1-636f-4cb1-8133-fa91df48fe59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.236 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7efb45e0-a7ad-4bb0-9628-52819323954d]: (4, ('Fri Jan 30 09:37:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e (2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567)\n2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567\nFri Jan 30 09:37:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a324d25-aadb-48ba-b761-6712d942455e (2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567)\n2491b094b679bef5255bbd9bf53502727ea61b65a7c7515125997fbb86df5567\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.239 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2eaf018c-7b54-44f3-bbfd-97967453bbcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:37:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:37:20.240 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a324d25-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.247 183134 DEBUG nova.virt.libvirt.vif [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:36:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1216546466',display_name='tempest-TestNetworkAdvancedServerOps-server-1216546466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1216546466',id=37,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF4xzDdvAT3TB6+pS58kA/Q6TjKvRHtp/rfQ8kRk+luyL76Y6GS0NpLkZhqSFAsNbiAU1WKUenXI/Y9pKszXstyjqDm111TlZMUsrXUlqK8gspewyPLq0RjqkSmEPiU09g==',key_name='tempest-TestNetworkAdvancedServerOps-198083341',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-8jcs92kv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:37:01Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=e8bcd6f1-636f-4cb1-8133-fa91df48fe59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.248 183134 DEBUG nova.network.os_vif_util [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "ce687f2c-b090-4884-959e-0d0e5154ace0", "address": "fa:16:3e:4d:a0:0d", "network": {"id": "0a324d25-aadb-48ba-b761-6712d942455e", "bridge": "br-int", "label": "tempest-network-smoke--495393347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce687f2c-b0", "ovs_interfaceid": "ce687f2c-b090-4884-959e-0d0e5154ace0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.248 183134 DEBUG nova.network.os_vif_util [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.248 183134 DEBUG os_vif [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.250 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.250 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce687f2c-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.290 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:20 np0005601977 kernel: tap0a324d25-a0: left promiscuous mode
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.294 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.295 183134 DEBUG nova.network.neutron [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Updated VIF entry in instance network info cache for port 3b40bc8e-82f2-4093-9eff-d0f741f37a3f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.296 183134 DEBUG nova.network.neutron [req-12541a82-2e7e-43e4-83c0-481097b9a56c req-ebe3b8df-03de-450f-aea8-dcdac5d9236e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 369ca743-1956-43ca-9978-4385a5862de5] Updating instance_info_cache with network_info: [{"id": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "address": "fa:16:3e:8c:50:9c", "network": {"id": "b2edea48-b03a-4c39-b516-89355e7acf87", "bridge": "br-int", "label": "tempest-network-smoke--1217622854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b40bc8e-82", "ovs_interfaceid": "3b40bc8e-82f2-4093-9eff-d0f741f37a3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "904b3416-66b9-4a6d-8b5c-92808f68a476", "address": "fa:16:3e:2a:5e:2a", "network": {"id": "f2b07532-97d0-4974-827c-4709f0bf52f6", "bridge": "br-int", "label": "tempest-network-smoke--844603973", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe2a:5e2a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap904b3416-66", "ovs_interfaceid": "904b3416-66b9-4a6d-8b5c-92808f68a476", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.298 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:37:20 np0005601977 nova_compute[183130]: 2026-01-30 09:37:20.301 183134 INFO os_vif [None req-2ca9d21c-6f50-4661-8ec5-51580d7a93e0 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:a0:0d,bridge_name='br-int',has_traffic_filtering=True,id=ce687f2c-b090-4884-959e-0d0e5154ace0,network=Network(0a324d25-aadb-48ba-b761-6712d942455e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce687f2c-b0')#033[00m
Jan 30 04:38:16 np0005601977 nova_compute[183130]: 2026-01-30 09:38:16.811 183134 DEBUG nova.objects.instance [None req-6637df39-9595-48d3-a3f9-89dae3dca9d6 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:38:16 np0005601977 podman[223755]: 2026-01-30 09:38:16.830131576 +0000 UTC m=+0.043601522 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:38:16 np0005601977 nova_compute[183130]: 2026-01-30 09:38:16.842 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765896.841728, 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:38:16 np0005601977 nova_compute[183130]: 2026-01-30 09:38:16.842 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:38:16 np0005601977 nova_compute[183130]: 2026-01-30 09:38:16.861 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:38:16 np0005601977 nova_compute[183130]: 2026-01-30 09:38:16.865 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:38:16 np0005601977 nova_compute[183130]: 2026-01-30 09:38:16.887 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 30 04:38:17 np0005601977 rsyslogd[1006]: imjournal: 832 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.399 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:38:17 np0005601977 kernel: tap13cb8e9e-bf (unregistering): left promiscuous mode
Jan 30 04:38:17 np0005601977 NetworkManager[55565]: <info>  [1769765897.4884] device (tap13cb8e9e-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.539 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:17Z|00426|binding|INFO|Releasing lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 from this chassis (sb_readonly=0)
Jan 30 04:38:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:17Z|00427|binding|INFO|Setting lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 down in Southbound
Jan 30 04:38:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:17Z|00428|binding|INFO|Removing iface tap13cb8e9e-bf ovn-installed in OVS
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.544 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.548 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:17 np0005601977 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 30 04:38:17 np0005601977 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000028.scope: Consumed 12.441s CPU time.
Jan 30 04:38:17 np0005601977 systemd-machined[154431]: Machine qemu-34-instance-00000028 terminated.
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.579 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:41:24 10.100.0.4'], port_security=['fa:16:3e:94:41:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f72b9cb-fbf3-42bf-b6c0-54640eec3466', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '10dbf1a5-239e-4069-96b7-a4012c9cf2fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df6a84d2-e9f1-4587-9db0-e39b2b457cb8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=13cb8e9e-bf8d-41e1-bc08-decc34875371) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.581 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 13cb8e9e-bf8d-41e1-bc08-decc34875371 in datapath b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd unbound from our chassis#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.583 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.584 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[88085bc4-60f2-4934-8406-2fb3f07fe018]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.585 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd namespace which is not needed anymore#033[00m
Jan 30 04:38:17 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223648]: [NOTICE]   (223652) : haproxy version is 2.8.14-c23fe91
Jan 30 04:38:17 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223648]: [NOTICE]   (223652) : path to executable is /usr/sbin/haproxy
Jan 30 04:38:17 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223648]: [WARNING]  (223652) : Exiting Master process...
Jan 30 04:38:17 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223648]: [WARNING]  (223652) : Exiting Master process...
Jan 30 04:38:17 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223648]: [ALERT]    (223652) : Current worker (223654) exited with code 143 (Terminated)
Jan 30 04:38:17 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223648]: [WARNING]  (223652) : All workers exited. Exiting... (0)
Jan 30 04:38:17 np0005601977 systemd[1]: libpod-d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64.scope: Deactivated successfully.
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.717 183134 DEBUG nova.compute.manager [None req-6637df39-9595-48d3-a3f9-89dae3dca9d6 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:38:17 np0005601977 podman[223806]: 2026-01-30 09:38:17.718830574 +0000 UTC m=+0.039811924 container died d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:38:17 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64-userdata-shm.mount: Deactivated successfully.
Jan 30 04:38:17 np0005601977 systemd[1]: var-lib-containers-storage-overlay-f6aa36d0fe4eb5b909c96dddb1a7679fe8aecc5bec6ba873b568df8fb304cf27-merged.mount: Deactivated successfully.
Jan 30 04:38:17 np0005601977 podman[223806]: 2026-01-30 09:38:17.753031875 +0000 UTC m=+0.074013225 container cleanup d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 30 04:38:17 np0005601977 systemd[1]: libpod-conmon-d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64.scope: Deactivated successfully.
Jan 30 04:38:17 np0005601977 podman[223849]: 2026-01-30 09:38:17.808083884 +0000 UTC m=+0.038043782 container remove d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.812 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[13633dcd-e820-4b61-b707-1d9640a365ec]: (4, ('Fri Jan 30 09:38:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd (d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64)\nd6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64\nFri Jan 30 09:38:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd (d6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64)\nd6c45e8b88b0944a8ca16a8a0b847ce46d6b12cf9b715e2b2a24aa299d0e6c64\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.815 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[abdec081-4b2d-4183-b461-e0f6dbf6d864]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.816 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ed0c10-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:17 np0005601977 kernel: tapb8ed0c10-f0: left promiscuous mode
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.819 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:17 np0005601977 nova_compute[183130]: 2026-01-30 09:38:17.830 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.833 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0c37a374-b535-4fbb-a57b-4cc8a534f96b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.857 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d8ddfd-05c2-41a0-b8f3-7865cd2a5fc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.859 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[68e3918e-b752-4040-b042-a7f26b7254fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.875 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[eb68f84c-9db7-4e0f-ae12-2cdf851d5f5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440799, 'reachable_time': 44244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223868, 'error': None, 'target': 'ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.878 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:38:17 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:17.879 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5b5281-57d7-45d0-82bd-7361ca5e65de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:17 np0005601977 systemd[1]: run-netns-ovnmeta\x2db8ed0c10\x2dfb1a\x2d495f\x2da2dc\x2dfb5a165d08fd.mount: Deactivated successfully.
Jan 30 04:38:18 np0005601977 nova_compute[183130]: 2026-01-30 09:38:18.409 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.116 183134 DEBUG nova.compute.manager [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-unplugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.117 183134 DEBUG oslo_concurrency.lockutils [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.117 183134 DEBUG oslo_concurrency.lockutils [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.117 183134 DEBUG oslo_concurrency.lockutils [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.117 183134 DEBUG nova.compute.manager [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] No waiting events found dispatching network-vif-unplugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.117 183134 WARNING nova.compute.manager [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received unexpected event network-vif-unplugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 for instance with vm_state suspended and task_state None.#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.118 183134 DEBUG nova.compute.manager [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.118 183134 DEBUG oslo_concurrency.lockutils [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.118 183134 DEBUG oslo_concurrency.lockutils [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.118 183134 DEBUG oslo_concurrency.lockutils [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.118 183134 DEBUG nova.compute.manager [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] No waiting events found dispatching network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.118 183134 WARNING nova.compute.manager [req-9aecb037-2116-4e1a-a049-7d30d757eac1 req-e4ff29ab-909f-4501-901c-5a4be1876b98 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received unexpected event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 for instance with vm_state suspended and task_state None.#033[00m
Jan 30 04:38:20 np0005601977 nova_compute[183130]: 2026-01-30 09:38:20.758 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.071 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.090 183134 INFO nova.compute.manager [None req-169c489e-421b-439b-a19a-55b878b8c9b6 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Get console output#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.323 183134 INFO nova.compute.manager [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Resuming#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.324 183134 DEBUG nova.objects.instance [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'flavor' on Instance uuid 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.662 183134 DEBUG oslo_concurrency.lockutils [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "refresh_cache-3f72b9cb-fbf3-42bf-b6c0-54640eec3466" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.663 183134 DEBUG oslo_concurrency.lockutils [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquired lock "refresh_cache-3f72b9cb-fbf3-42bf-b6c0-54640eec3466" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:38:21 np0005601977 nova_compute[183130]: 2026-01-30 09:38:21.663 183134 DEBUG nova.network.neutron [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.310 183134 DEBUG nova.network.neutron [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Updating instance_info_cache with network_info: [{"id": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "address": "fa:16:3e:94:41:24", "network": {"id": "b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd", "bridge": "br-int", "label": "tempest-network-smoke--1871691314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cb8e9e-bf", "ovs_interfaceid": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.408 183134 DEBUG oslo_concurrency.lockutils [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Releasing lock "refresh_cache-3f72b9cb-fbf3-42bf-b6c0-54640eec3466" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.416 183134 DEBUG nova.virt.libvirt.vif [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:37:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-460321422',display_name='tempest-TestNetworkAdvancedServerOps-server-460321422',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-460321422',id=40,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHGxLYkNGOqQwlLHoN/jF3P1E0fyGbR8h2qexDDjLzmClyLd602v6Tys94o/gxnd87uORMB/wQUYmXxB5ay6W3ZYlQW0+9mEoEPhdU9ZjFEq00wOsbR3dZVV9qeRRZCOrw==',key_name='tempest-TestNetworkAdvancedServerOps-1189635368',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:37:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-nhk7d60u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:38:17Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=3f72b9cb-fbf3-42bf-b6c0-54640eec3466,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "address": "fa:16:3e:94:41:24", "network": {"id": "b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd", "bridge": "br-int", "label": "tempest-network-smoke--1871691314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cb8e9e-bf", "ovs_interfaceid": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.416 183134 DEBUG nova.network.os_vif_util [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "address": "fa:16:3e:94:41:24", "network": {"id": "b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd", "bridge": "br-int", "label": "tempest-network-smoke--1871691314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cb8e9e-bf", "ovs_interfaceid": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.418 183134 DEBUG nova.network.os_vif_util [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:41:24,bridge_name='br-int',has_traffic_filtering=True,id=13cb8e9e-bf8d-41e1-bc08-decc34875371,network=Network(b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cb8e9e-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.418 183134 DEBUG os_vif [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:41:24,bridge_name='br-int',has_traffic_filtering=True,id=13cb8e9e-bf8d-41e1-bc08-decc34875371,network=Network(b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cb8e9e-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.419 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.420 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.420 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.423 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.424 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13cb8e9e-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.424 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13cb8e9e-bf, col_values=(('external_ids', {'iface-id': '13cb8e9e-bf8d-41e1-bc08-decc34875371', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:41:24', 'vm-uuid': '3f72b9cb-fbf3-42bf-b6c0-54640eec3466'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.425 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.426 183134 INFO os_vif [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:41:24,bridge_name='br-int',has_traffic_filtering=True,id=13cb8e9e-bf8d-41e1-bc08-decc34875371,network=Network(b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cb8e9e-bf')#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.449 183134 DEBUG nova.objects.instance [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'numa_topology' on Instance uuid 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:38:23 np0005601977 kernel: tap13cb8e9e-bf: entered promiscuous mode
Jan 30 04:38:23 np0005601977 NetworkManager[55565]: <info>  [1769765903.5361] manager: (tap13cb8e9e-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.536 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:23Z|00429|binding|INFO|Claiming lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 for this chassis.
Jan 30 04:38:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:23Z|00430|binding|INFO|13cb8e9e-bf8d-41e1-bc08-decc34875371: Claiming fa:16:3e:94:41:24 10.100.0.4
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.543 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:23Z|00431|binding|INFO|Setting lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 ovn-installed in OVS
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.544 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.546 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:23Z|00432|binding|INFO|Setting lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 up in Southbound
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.550 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:41:24 10.100.0.4'], port_security=['fa:16:3e:94:41:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f72b9cb-fbf3-42bf-b6c0-54640eec3466', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '10dbf1a5-239e-4069-96b7-a4012c9cf2fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df6a84d2-e9f1-4587-9db0-e39b2b457cb8, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=13cb8e9e-bf8d-41e1-bc08-decc34875371) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.551 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 13cb8e9e-bf8d-41e1-bc08-decc34875371 in datapath b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd bound to our chassis#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.553 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.565 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5bc955ff-5883-431d-b01d-f482727392a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.566 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8ed0c10-f1 in ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:38:23 np0005601977 systemd-machined[154431]: New machine qemu-35-instance-00000028.
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.568 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8ed0c10-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.569 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fb3d37-cdae-4328-82cf-dc128faf255b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.569 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[32d55c47-976d-401a-bc9b-8cead127a20e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.581 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[08b04521-9533-4e63-b780-eac683788ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 systemd[1]: Started Virtual Machine qemu-35-instance-00000028.
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.596 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4e094b55-6964-4df7-a0c1-72b5c87f0a78]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 systemd-udevd[223888]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:38:23 np0005601977 NetworkManager[55565]: <info>  [1769765903.6177] device (tap13cb8e9e-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:38:23 np0005601977 NetworkManager[55565]: <info>  [1769765903.6189] device (tap13cb8e9e-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.627 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[521b3301-4603-41c3-80fa-62702fae4e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.632 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f132a37a-de7d-4d7d-b594-16d1a57af011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 systemd-udevd[223892]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:38:23 np0005601977 NetworkManager[55565]: <info>  [1769765903.6343] manager: (tapb8ed0c10-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.661 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f635161b-20d1-4863-a75c-417b458c2e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.664 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2eac5bdb-57ae-4c30-afb8-ffb977892502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 NetworkManager[55565]: <info>  [1769765903.6844] device (tapb8ed0c10-f0): carrier: link connected
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.689 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d20263-72df-488a-ae86-d0ded8295282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.701 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d6deeb-0f0d-437b-b4d9-bb608c098fe7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ed0c10-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:00:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443502, 'reachable_time': 36165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223918, 'error': None, 'target': 'ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.713 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae1dd95-4af9-4cab-8a88-c726d96d1353]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:25'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443502, 'tstamp': 443502}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223919, 'error': None, 'target': 'ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.729 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[46dc9fb6-aa65-460c-9ef1-aff11ef3b2a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ed0c10-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:00:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443502, 'reachable_time': 36165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223920, 'error': None, 'target': 'ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.757 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f6da40f1-0670-4764-982b-284863ffd4ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.814 183134 DEBUG nova.compute.manager [req-ab2f0698-cf4c-46e9-b03b-bbd2deba2a84 req-01d1a8d2-34e1-49e5-9b67-b511792f0f47 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.815 183134 DEBUG oslo_concurrency.lockutils [req-ab2f0698-cf4c-46e9-b03b-bbd2deba2a84 req-01d1a8d2-34e1-49e5-9b67-b511792f0f47 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.815 183134 DEBUG oslo_concurrency.lockutils [req-ab2f0698-cf4c-46e9-b03b-bbd2deba2a84 req-01d1a8d2-34e1-49e5-9b67-b511792f0f47 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.816 183134 DEBUG oslo_concurrency.lockutils [req-ab2f0698-cf4c-46e9-b03b-bbd2deba2a84 req-01d1a8d2-34e1-49e5-9b67-b511792f0f47 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.816 183134 DEBUG nova.compute.manager [req-ab2f0698-cf4c-46e9-b03b-bbd2deba2a84 req-01d1a8d2-34e1-49e5-9b67-b511792f0f47 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] No waiting events found dispatching network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.817 183134 WARNING nova.compute.manager [req-ab2f0698-cf4c-46e9-b03b-bbd2deba2a84 req-01d1a8d2-34e1-49e5-9b67-b511792f0f47 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received unexpected event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.820 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[da791493-0692-40d9-a4fa-44d4c9c1bb78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.822 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ed0c10-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.822 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.823 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ed0c10-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:23 np0005601977 kernel: tapb8ed0c10-f0: entered promiscuous mode
Jan 30 04:38:23 np0005601977 NetworkManager[55565]: <info>  [1769765903.8272] manager: (tapb8ed0c10-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.826 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.830 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ed0c10-f0, col_values=(('external_ids', {'iface-id': '75d5640e-14a0-4765-922f-fb52aa3f3de5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:23 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:23Z|00433|binding|INFO|Releasing lport 75d5640e-14a0-4765-922f-fb52aa3f3de5 from this chassis (sb_readonly=0)
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.834 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.835 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[102bc3ad-9f38-459d-a9b1-a29ad4343189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.836 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd.pid.haproxy
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:38:23 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:23.837 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'env', 'PROCESS_TAG=haproxy-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:38:23 np0005601977 nova_compute[183130]: 2026-01-30 09:38:23.839 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.142 183134 DEBUG nova.virt.libvirt.host [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Removed pending event for 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.144 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765904.1420143, 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.144 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] VM Started (Lifecycle Event)#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.164 183134 DEBUG nova.compute.manager [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.165 183134 DEBUG nova.objects.instance [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.168 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.171 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.189 183134 INFO nova.virt.libvirt.driver [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Instance running successfully.#033[00m
Jan 30 04:38:24 np0005601977 virtqemud[182587]: argument unsupported: QEMU guest agent is not configured
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.191 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.192 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769765904.1523988, 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.192 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.195 183134 DEBUG nova.virt.libvirt.guest [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.195 183134 DEBUG nova.compute.manager [None req-a4856b54-8392-4ffe-ae63-7732cbf6b5b3 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:38:24 np0005601977 podman[223959]: 2026-01-30 09:38:24.21586795 +0000 UTC m=+0.054586178 container create b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.222 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.226 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:38:24 np0005601977 nova_compute[183130]: 2026-01-30 09:38:24.248 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 30 04:38:24 np0005601977 systemd[1]: Started libpod-conmon-b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91.scope.
Jan 30 04:38:24 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:38:24 np0005601977 podman[223959]: 2026-01-30 09:38:24.188595967 +0000 UTC m=+0.027314255 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:38:24 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6385dc2ec386d130f6c6db56f84f760f5e19865ec1bf3b4dc517f14a75730157/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:38:24 np0005601977 podman[223959]: 2026-01-30 09:38:24.295909956 +0000 UTC m=+0.134628194 container init b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:38:24 np0005601977 podman[223959]: 2026-01-30 09:38:24.30056105 +0000 UTC m=+0.139279278 container start b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 30 04:38:24 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [NOTICE]   (223978) : New worker (223980) forked
Jan 30 04:38:24 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [NOTICE]   (223978) : Loading success.
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.810 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.894 183134 DEBUG nova.compute.manager [req-3b610839-56af-49ee-b195-883c65f483ae req-e75c57ed-49d6-4656-8800-3f6b62840379 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.894 183134 DEBUG oslo_concurrency.lockutils [req-3b610839-56af-49ee-b195-883c65f483ae req-e75c57ed-49d6-4656-8800-3f6b62840379 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.894 183134 DEBUG oslo_concurrency.lockutils [req-3b610839-56af-49ee-b195-883c65f483ae req-e75c57ed-49d6-4656-8800-3f6b62840379 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.895 183134 DEBUG oslo_concurrency.lockutils [req-3b610839-56af-49ee-b195-883c65f483ae req-e75c57ed-49d6-4656-8800-3f6b62840379 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.895 183134 DEBUG nova.compute.manager [req-3b610839-56af-49ee-b195-883c65f483ae req-e75c57ed-49d6-4656-8800-3f6b62840379 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] No waiting events found dispatching network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:38:25 np0005601977 nova_compute[183130]: 2026-01-30 09:38:25.895 183134 WARNING nova.compute.manager [req-3b610839-56af-49ee-b195-883c65f483ae req-e75c57ed-49d6-4656-8800-3f6b62840379 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received unexpected event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:38:26 np0005601977 nova_compute[183130]: 2026-01-30 09:38:26.073 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:26 np0005601977 nova_compute[183130]: 2026-01-30 09:38:26.143 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:26 np0005601977 podman[223989]: 2026-01-30 09:38:26.839672969 +0000 UTC m=+0.055560145 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.7, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, architecture=x86_64)
Jan 30 04:38:26 np0005601977 podman[223990]: 2026-01-30 09:38:26.868353601 +0000 UTC m=+0.079158952 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 30 04:38:27 np0005601977 nova_compute[183130]: 2026-01-30 09:38:27.558 183134 INFO nova.compute.manager [None req-957ef0f9-4c0e-4f01-891d-dcd95f0f9f08 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Get console output#033[00m
Jan 30 04:38:27 np0005601977 nova_compute[183130]: 2026-01-30 09:38:27.565 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.576 183134 DEBUG nova.compute.manager [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-changed-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.576 183134 DEBUG nova.compute.manager [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Refreshing instance network info cache due to event network-changed-13cb8e9e-bf8d-41e1-bc08-decc34875371. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.576 183134 DEBUG oslo_concurrency.lockutils [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-3f72b9cb-fbf3-42bf-b6c0-54640eec3466" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.577 183134 DEBUG oslo_concurrency.lockutils [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-3f72b9cb-fbf3-42bf-b6c0-54640eec3466" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.577 183134 DEBUG nova.network.neutron [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Refreshing network info cache for port 13cb8e9e-bf8d-41e1-bc08-decc34875371 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.662 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.663 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.663 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.663 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.663 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.664 183134 INFO nova.compute.manager [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Terminating instance#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.665 183134 DEBUG nova.compute.manager [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:38:29 np0005601977 kernel: tap13cb8e9e-bf (unregistering): left promiscuous mode
Jan 30 04:38:29 np0005601977 NetworkManager[55565]: <info>  [1769765909.6982] device (tap13cb8e9e-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:38:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:29Z|00434|binding|INFO|Releasing lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 from this chassis (sb_readonly=0)
Jan 30 04:38:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:29Z|00435|binding|INFO|Setting lport 13cb8e9e-bf8d-41e1-bc08-decc34875371 down in Southbound
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.705 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 ovn_controller[95460]: 2026-01-30T09:38:29Z|00436|binding|INFO|Removing iface tap13cb8e9e-bf ovn-installed in OVS
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.708 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.713 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:41:24 10.100.0.4'], port_security=['fa:16:3e:94:41:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f72b9cb-fbf3-42bf-b6c0-54640eec3466', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f3fcd6f23d74ceca8c3efd31a373f0b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '10dbf1a5-239e-4069-96b7-a4012c9cf2fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df6a84d2-e9f1-4587-9db0-e39b2b457cb8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=13cb8e9e-bf8d-41e1-bc08-decc34875371) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.714 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 13cb8e9e-bf8d-41e1-bc08-decc34875371 in datapath b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd unbound from our chassis#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.715 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.717 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8b56e4c9-95f1-48aa-bbf5-bb067f172976]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.717 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.718 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd namespace which is not needed anymore#033[00m
Jan 30 04:38:29 np0005601977 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 30 04:38:29 np0005601977 systemd-machined[154431]: Machine qemu-35-instance-00000028 terminated.
Jan 30 04:38:29 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [NOTICE]   (223978) : haproxy version is 2.8.14-c23fe91
Jan 30 04:38:29 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [NOTICE]   (223978) : path to executable is /usr/sbin/haproxy
Jan 30 04:38:29 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [WARNING]  (223978) : Exiting Master process...
Jan 30 04:38:29 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [WARNING]  (223978) : Exiting Master process...
Jan 30 04:38:29 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [ALERT]    (223978) : Current worker (223980) exited with code 143 (Terminated)
Jan 30 04:38:29 np0005601977 neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd[223974]: [WARNING]  (223978) : All workers exited. Exiting... (0)
Jan 30 04:38:29 np0005601977 systemd[1]: libpod-b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91.scope: Deactivated successfully.
Jan 30 04:38:29 np0005601977 podman[224057]: 2026-01-30 09:38:29.85601027 +0000 UTC m=+0.045614070 container died b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:38:29 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91-userdata-shm.mount: Deactivated successfully.
Jan 30 04:38:29 np0005601977 systemd[1]: var-lib-containers-storage-overlay-6385dc2ec386d130f6c6db56f84f760f5e19865ec1bf3b4dc517f14a75730157-merged.mount: Deactivated successfully.
Jan 30 04:38:29 np0005601977 podman[224057]: 2026-01-30 09:38:29.890711775 +0000 UTC m=+0.080315575 container cleanup b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 30 04:38:29 np0005601977 systemd[1]: libpod-conmon-b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91.scope: Deactivated successfully.
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.917 183134 INFO nova.virt.libvirt.driver [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Instance destroyed successfully.#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.919 183134 DEBUG nova.objects.instance [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lazy-loading 'resources' on Instance uuid 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.946 183134 DEBUG nova.virt.libvirt.vif [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:37:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-460321422',display_name='tempest-TestNetworkAdvancedServerOps-server-460321422',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-460321422',id=40,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHGxLYkNGOqQwlLHoN/jF3P1E0fyGbR8h2qexDDjLzmClyLd602v6Tys94o/gxnd87uORMB/wQUYmXxB5ay6W3ZYlQW0+9mEoEPhdU9ZjFEq00wOsbR3dZVV9qeRRZCOrw==',key_name='tempest-TestNetworkAdvancedServerOps-1189635368',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:37:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3f3fcd6f23d74ceca8c3efd31a373f0b',ramdisk_id='',reservation_id='r-nhk7d60u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-856785562',owner_user_name='tempest-TestNetworkAdvancedServerOps-856785562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:38:24Z,user_data=None,user_id='67d560d0067b4b56aa346073fcc16d6d',uuid=3f72b9cb-fbf3-42bf-b6c0-54640eec3466,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "address": "fa:16:3e:94:41:24", "network": {"id": "b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd", "bridge": "br-int", "label": "tempest-network-smoke--1871691314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cb8e9e-bf", "ovs_interfaceid": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.947 183134 DEBUG nova.network.os_vif_util [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converting VIF {"id": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "address": "fa:16:3e:94:41:24", "network": {"id": "b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd", "bridge": "br-int", "label": "tempest-network-smoke--1871691314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cb8e9e-bf", "ovs_interfaceid": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.947 183134 DEBUG nova.network.os_vif_util [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:41:24,bridge_name='br-int',has_traffic_filtering=True,id=13cb8e9e-bf8d-41e1-bc08-decc34875371,network=Network(b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cb8e9e-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.948 183134 DEBUG os_vif [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:41:24,bridge_name='br-int',has_traffic_filtering=True,id=13cb8e9e-bf8d-41e1-bc08-decc34875371,network=Network(b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cb8e9e-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.949 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.949 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13cb8e9e-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:29 np0005601977 podman[224097]: 2026-01-30 09:38:29.950072008 +0000 UTC m=+0.036156328 container remove b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.952 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.953 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.953 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[11caaaea-177b-4769-8fab-792651d133dc]: (4, ('Fri Jan 30 09:38:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd (b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91)\nb4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91\nFri Jan 30 09:38:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd (b4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91)\nb4189ce5d7e0c03498d0c72480e683fd9c027445a92d624a9d782ef5cf5d9e91\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.954 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a8e7cd95-a275-4644-9c8f-2e571ce4536a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.955 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ed0c10-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.955 183134 INFO os_vif [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:41:24,bridge_name='br-int',has_traffic_filtering=True,id=13cb8e9e-bf8d-41e1-bc08-decc34875371,network=Network(b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cb8e9e-bf')#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.956 183134 INFO nova.virt.libvirt.driver [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Deleting instance files /var/lib/nova/instances/3f72b9cb-fbf3-42bf-b6c0-54640eec3466_del#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.956 183134 INFO nova.virt.libvirt.driver [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Deletion of /var/lib/nova/instances/3f72b9cb-fbf3-42bf-b6c0-54640eec3466_del complete#033[00m
Jan 30 04:38:29 np0005601977 kernel: tapb8ed0c10-f0: left promiscuous mode
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.958 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.961 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 nova_compute[183130]: 2026-01-30 09:38:29.962 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.962 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[266f7ab3-8951-4f9c-be06-1a30b7558e43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.984 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[30b85273-eba4-4a9f-90f9-f93e6078b33a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.985 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6c52ee01-720b-445c-bc5e-95b06eafa0cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.995 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ed786776-777e-441e-9d9d-5e89954707eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443496, 'reachable_time': 30926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224119, 'error': None, 'target': 'ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:29 np0005601977 systemd[1]: run-netns-ovnmeta\x2db8ed0c10\x2dfb1a\x2d495f\x2da2dc\x2dfb5a165d08fd.mount: Deactivated successfully.
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.999 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:38:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:29.999 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd90efb-98d3-40ce-8f92-6a206f7ea20b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.044 183134 INFO nova.compute.manager [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.045 183134 DEBUG oslo.service.loopingcall [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.045 183134 DEBUG nova.compute.manager [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.046 183134 DEBUG nova.network.neutron [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.213 183134 DEBUG nova.compute.manager [req-fc3e67ac-f86e-46b0-b0eb-276f07bb23f1 req-dc1feeee-9d9e-46a1-8560-8b2fc0dfe6de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-unplugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.213 183134 DEBUG oslo_concurrency.lockutils [req-fc3e67ac-f86e-46b0-b0eb-276f07bb23f1 req-dc1feeee-9d9e-46a1-8560-8b2fc0dfe6de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.214 183134 DEBUG oslo_concurrency.lockutils [req-fc3e67ac-f86e-46b0-b0eb-276f07bb23f1 req-dc1feeee-9d9e-46a1-8560-8b2fc0dfe6de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.214 183134 DEBUG oslo_concurrency.lockutils [req-fc3e67ac-f86e-46b0-b0eb-276f07bb23f1 req-dc1feeee-9d9e-46a1-8560-8b2fc0dfe6de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.215 183134 DEBUG nova.compute.manager [req-fc3e67ac-f86e-46b0-b0eb-276f07bb23f1 req-dc1feeee-9d9e-46a1-8560-8b2fc0dfe6de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] No waiting events found dispatching network-vif-unplugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:38:30 np0005601977 nova_compute[183130]: 2026-01-30 09:38:30.215 183134 DEBUG nova.compute.manager [req-fc3e67ac-f86e-46b0-b0eb-276f07bb23f1 req-dc1feeee-9d9e-46a1-8560-8b2fc0dfe6de dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-unplugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:38:31 np0005601977 nova_compute[183130]: 2026-01-30 09:38:31.075 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:32 np0005601977 nova_compute[183130]: 2026-01-30 09:38:32.895 183134 DEBUG nova.compute.manager [req-2eadd2bd-be1d-45f5-ad27-f6d5ea60f961 req-e7f259eb-6e70-4463-93a2-32b236d03e29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:32 np0005601977 nova_compute[183130]: 2026-01-30 09:38:32.895 183134 DEBUG oslo_concurrency.lockutils [req-2eadd2bd-be1d-45f5-ad27-f6d5ea60f961 req-e7f259eb-6e70-4463-93a2-32b236d03e29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:32 np0005601977 nova_compute[183130]: 2026-01-30 09:38:32.896 183134 DEBUG oslo_concurrency.lockutils [req-2eadd2bd-be1d-45f5-ad27-f6d5ea60f961 req-e7f259eb-6e70-4463-93a2-32b236d03e29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:32 np0005601977 nova_compute[183130]: 2026-01-30 09:38:32.896 183134 DEBUG oslo_concurrency.lockutils [req-2eadd2bd-be1d-45f5-ad27-f6d5ea60f961 req-e7f259eb-6e70-4463-93a2-32b236d03e29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:32 np0005601977 nova_compute[183130]: 2026-01-30 09:38:32.896 183134 DEBUG nova.compute.manager [req-2eadd2bd-be1d-45f5-ad27-f6d5ea60f961 req-e7f259eb-6e70-4463-93a2-32b236d03e29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] No waiting events found dispatching network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:38:32 np0005601977 nova_compute[183130]: 2026-01-30 09:38:32.896 183134 WARNING nova.compute.manager [req-2eadd2bd-be1d-45f5-ad27-f6d5ea60f961 req-e7f259eb-6e70-4463-93a2-32b236d03e29 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received unexpected event network-vif-plugged-13cb8e9e-bf8d-41e1-bc08-decc34875371 for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:38:33 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:33.301 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:a8:ae 2001:db8:0:1:f816:3eff:fe92:a8ae 2001:db8::f816:3eff:fe92:a8ae'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe92:a8ae/64 2001:db8::f816:3eff:fe92:a8ae/64', 'neutron:device_id': 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28f4e804-b114-47c5-adeb-b6f124a69855, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a360118c-35c8-4b21-89e2-7d8f63dcd10b) old=Port_Binding(mac=['fa:16:3e:92:a8:ae 2001:db8::f816:3eff:fe92:a8ae'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe92:a8ae/64', 'neutron:device_id': 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:38:33 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:33.302 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a360118c-35c8-4b21-89e2-7d8f63dcd10b in datapath 925994ed-d8a1-422b-a2d5-57ed39eb5751 updated#033[00m
Jan 30 04:38:33 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:33.303 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 925994ed-d8a1-422b-a2d5-57ed39eb5751, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:38:33 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:33.304 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b0080610-71f6-463e-b1d6-ebaf6f1cbc5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.668 183134 DEBUG nova.network.neutron [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.715 183134 INFO nova.compute.manager [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Took 3.67 seconds to deallocate network for instance.#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.766 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.767 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:33 np0005601977 podman[224120]: 2026-01-30 09:38:33.840142358 +0000 UTC m=+0.055477852 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.842 183134 DEBUG nova.compute.provider_tree [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.869 183134 DEBUG nova.scheduler.client.report [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:38:33 np0005601977 podman[224121]: 2026-01-30 09:38:33.870434108 +0000 UTC m=+0.084114655 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.922 183134 DEBUG nova.network.neutron [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Updated VIF entry in instance network info cache for port 13cb8e9e-bf8d-41e1-bc08-decc34875371. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.923 183134 DEBUG nova.network.neutron [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Updating instance_info_cache with network_info: [{"id": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "address": "fa:16:3e:94:41:24", "network": {"id": "b8ed0c10-fb1a-495f-a2dc-fb5a165d08fd", "bridge": "br-int", "label": "tempest-network-smoke--1871691314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3f3fcd6f23d74ceca8c3efd31a373f0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cb8e9e-bf", "ovs_interfaceid": "13cb8e9e-bf8d-41e1-bc08-decc34875371", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:38:33 np0005601977 nova_compute[183130]: 2026-01-30 09:38:33.964 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.004 183134 DEBUG oslo_concurrency.lockutils [req-3bc7aced-acb3-4b63-8ff3-27e6d1e0716c req-cabc6f50-2696-4e1a-b25c-1b2d44e297bd dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-3f72b9cb-fbf3-42bf-b6c0-54640eec3466" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.015 183134 INFO nova.scheduler.client.report [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Deleted allocations for instance 3f72b9cb-fbf3-42bf-b6c0-54640eec3466#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.111 183134 DEBUG oslo_concurrency.lockutils [None req-935e7c46-a6d6-4c84-94bb-d0a17656ea11 67d560d0067b4b56aa346073fcc16d6d 3f3fcd6f23d74ceca8c3efd31a373f0b - - default default] Lock "3f72b9cb-fbf3-42bf-b6c0-54640eec3466" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.953 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.998 183134 DEBUG nova.compute.manager [req-084978a7-131f-43bb-8771-b8ae5aad71ae req-cf750b64-888a-4eea-82e6-34186153df1f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Received event network-vif-deleted-13cb8e9e-bf8d-41e1-bc08-decc34875371 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.999 183134 INFO nova.compute.manager [req-084978a7-131f-43bb-8771-b8ae5aad71ae req-cf750b64-888a-4eea-82e6-34186153df1f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Neutron deleted interface 13cb8e9e-bf8d-41e1-bc08-decc34875371; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:38:34 np0005601977 nova_compute[183130]: 2026-01-30 09:38:34.999 183134 DEBUG nova.network.neutron [req-084978a7-131f-43bb-8771-b8ae5aad71ae req-cf750b64-888a-4eea-82e6-34186153df1f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 30 04:38:35 np0005601977 nova_compute[183130]: 2026-01-30 09:38:35.001 183134 DEBUG nova.compute.manager [req-084978a7-131f-43bb-8771-b8ae5aad71ae req-cf750b64-888a-4eea-82e6-34186153df1f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Detach interface failed, port_id=13cb8e9e-bf8d-41e1-bc08-decc34875371, reason: Instance 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 30 04:38:36 np0005601977 nova_compute[183130]: 2026-01-30 09:38:36.077 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:36 np0005601977 nova_compute[183130]: 2026-01-30 09:38:36.226 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:36 np0005601977 nova_compute[183130]: 2026-01-30 09:38:36.284 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:37 np0005601977 podman[224161]: 2026-01-30 09:38:37.913381963 +0000 UTC m=+0.133548522 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:38:39 np0005601977 nova_compute[183130]: 2026-01-30 09:38:39.955 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:41 np0005601977 nova_compute[183130]: 2026-01-30 09:38:41.080 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:44 np0005601977 nova_compute[183130]: 2026-01-30 09:38:44.917 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769765909.9158359, 3f72b9cb-fbf3-42bf-b6c0-54640eec3466 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:38:44 np0005601977 nova_compute[183130]: 2026-01-30 09:38:44.918 183134 INFO nova.compute.manager [-] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:38:44 np0005601977 nova_compute[183130]: 2026-01-30 09:38:44.946 183134 DEBUG nova.compute.manager [None req-ba8bdb2f-8ea6-41c5-9d41-ad610eea6ebe - - - - - -] [instance: 3f72b9cb-fbf3-42bf-b6c0-54640eec3466] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:38:44 np0005601977 nova_compute[183130]: 2026-01-30 09:38:44.993 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:46 np0005601977 nova_compute[183130]: 2026-01-30 09:38:46.130 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:47 np0005601977 podman[224187]: 2026-01-30 09:38:47.835107539 +0000 UTC m=+0.056261195 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:38:49 np0005601977 nova_compute[183130]: 2026-01-30 09:38:49.994 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:51 np0005601977 nova_compute[183130]: 2026-01-30 09:38:51.132 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:54 np0005601977 nova_compute[183130]: 2026-01-30 09:38:54.996 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:56 np0005601977 nova_compute[183130]: 2026-01-30 09:38:56.133 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:38:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:57.391 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:57.392 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:38:57.392 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:57 np0005601977 podman[224212]: 2026-01-30 09:38:57.858069798 +0000 UTC m=+0.067207199 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:38:57 np0005601977 podman[224211]: 2026-01-30 09:38:57.858875281 +0000 UTC m=+0.077094083 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, 
io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, container_name=openstack_network_exporter)
Jan 30 04:38:59 np0005601977 nova_compute[183130]: 2026-01-30 09:38:59.997 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:01 np0005601977 nova_compute[183130]: 2026-01-30 09:39:01.136 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:02 np0005601977 nova_compute[183130]: 2026-01-30 09:39:02.258 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:39:02.258 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:39:02 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:39:02.260 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:39:04 np0005601977 podman[224256]: 2026-01-30 09:39:04.850326622 +0000 UTC m=+0.060523016 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:39:04 np0005601977 podman[224255]: 2026-01-30 09:39:04.867495395 +0000 UTC m=+0.083824925 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:39:04 np0005601977 nova_compute[183130]: 2026-01-30 09:39:04.999 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:06 np0005601977 nova_compute[183130]: 2026-01-30 09:39:06.137 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.366 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.366 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.537 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.539 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5626MB free_disk=73.24970626831055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.539 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.539 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.661 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.662 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.683 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.699 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.724 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:39:07 np0005601977 nova_compute[183130]: 2026-01-30 09:39:07.725 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:08 np0005601977 podman[224299]: 2026-01-30 09:39:08.945027053 +0000 UTC m=+0.157127629 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:39:10 np0005601977 nova_compute[183130]: 2026-01-30 09:39:10.040 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:10 np0005601977 nova_compute[183130]: 2026-01-30 09:39:10.726 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:11 np0005601977 nova_compute[183130]: 2026-01-30 09:39:11.170 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:11 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:39:11.262 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:39:14 np0005601977 nova_compute[183130]: 2026-01-30 09:39:14.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.041 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.338 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.357 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.357 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.357 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.358 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:15 np0005601977 nova_compute[183130]: 2026-01-30 09:39:15.358 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:39:16 np0005601977 nova_compute[183130]: 2026-01-30 09:39:16.171 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:16 np0005601977 nova_compute[183130]: 2026-01-30 09:39:16.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:16 np0005601977 nova_compute[183130]: 2026-01-30 09:39:16.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:39:16 np0005601977 nova_compute[183130]: 2026-01-30 09:39:16.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:39:16 np0005601977 nova_compute[183130]: 2026-01-30 09:39:16.381 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:39:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:39:17Z|00437|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 30 04:39:18 np0005601977 podman[224326]: 2026-01-30 09:39:18.828507051 +0000 UTC m=+0.048848392 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:39:19 np0005601977 nova_compute[183130]: 2026-01-30 09:39:19.376 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:20 np0005601977 nova_compute[183130]: 2026-01-30 09:39:20.043 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:21 np0005601977 nova_compute[183130]: 2026-01-30 09:39:21.173 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:25 np0005601977 nova_compute[183130]: 2026-01-30 09:39:25.080 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:26 np0005601977 nova_compute[183130]: 2026-01-30 09:39:26.215 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:28 np0005601977 podman[224351]: 2026-01-30 09:39:28.82812864 +0000 UTC m=+0.049549113 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Jan 30 04:39:28 np0005601977 podman[224352]: 2026-01-30 09:39:28.828756198 +0000 UTC m=+0.048673398 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 30 04:39:30 np0005601977 nova_compute[183130]: 2026-01-30 09:39:30.081 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:31 np0005601977 nova_compute[183130]: 2026-01-30 09:39:31.215 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:35 np0005601977 nova_compute[183130]: 2026-01-30 09:39:35.117 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:35 np0005601977 podman[224392]: 2026-01-30 09:39:35.194025113 +0000 UTC m=+0.054346560 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:39:35 np0005601977 podman[224393]: 2026-01-30 09:39:35.193963082 +0000 UTC m=+0.048688508 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:39:36 np0005601977 nova_compute[183130]: 2026-01-30 09:39:36.247 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:39 np0005601977 podman[224435]: 2026-01-30 09:39:39.876004984 +0000 UTC m=+0.091235589 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:39:40 np0005601977 nova_compute[183130]: 2026-01-30 09:39:40.122 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:41 np0005601977 nova_compute[183130]: 2026-01-30 09:39:41.249 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:45 np0005601977 nova_compute[183130]: 2026-01-30 09:39:45.124 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:46 np0005601977 nova_compute[183130]: 2026-01-30 09:39:46.250 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:49 np0005601977 podman[224461]: 2026-01-30 09:39:49.860834848 +0000 UTC m=+0.075782056 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:39:50 np0005601977 nova_compute[183130]: 2026-01-30 09:39:50.124 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:51 np0005601977 nova_compute[183130]: 2026-01-30 09:39:51.252 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:55 np0005601977 nova_compute[183130]: 2026-01-30 09:39:55.126 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:39:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:56 np0005601977 nova_compute[183130]: 2026-01-30 09:39:56.255 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:39:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:39:57.393 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:39:57.394 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:39:57.395 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:59 np0005601977 podman[224485]: 2026-01-30 09:39:59.837293693 +0000 UTC m=+0.053709392 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1769056855, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Jan 30 04:39:59 np0005601977 podman[224486]: 2026-01-30 09:39:59.858322316 +0000 UTC m=+0.069031811 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:40:00 np0005601977 nova_compute[183130]: 2026-01-30 09:40:00.127 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:01 np0005601977 nova_compute[183130]: 2026-01-30 09:40:01.256 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:05 np0005601977 nova_compute[183130]: 2026-01-30 09:40:05.129 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:05 np0005601977 podman[224525]: 2026-01-30 09:40:05.862551763 +0000 UTC m=+0.070698010 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 30 04:40:05 np0005601977 podman[224526]: 2026-01-30 09:40:05.86978425 +0000 UTC m=+0.073824019 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:40:06 np0005601977 nova_compute[183130]: 2026-01-30 09:40:06.257 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.420 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.420 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.420 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.420 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.594 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.595 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.24970626831055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.595 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.595 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.710 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.710 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.729 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.839 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.842 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:40:07 np0005601977 nova_compute[183130]: 2026-01-30 09:40:07.842 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:40:10 np0005601977 nova_compute[183130]: 2026-01-30 09:40:10.131 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:10 np0005601977 podman[224570]: 2026-01-30 09:40:10.846548397 +0000 UTC m=+0.070239045 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 30 04:40:11 np0005601977 nova_compute[183130]: 2026-01-30 09:40:11.261 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:11 np0005601977 nova_compute[183130]: 2026-01-30 09:40:11.841 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:14 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:40:14.327 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:40:14 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:40:14.328 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:40:14 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:40:14.329 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:40:14 np0005601977 nova_compute[183130]: 2026-01-30 09:40:14.329 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:15 np0005601977 nova_compute[183130]: 2026-01-30 09:40:15.132 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:15 np0005601977 nova_compute[183130]: 2026-01-30 09:40:15.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:15 np0005601977 nova_compute[183130]: 2026-01-30 09:40:15.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:15 np0005601977 nova_compute[183130]: 2026-01-30 09:40:15.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:15 np0005601977 nova_compute[183130]: 2026-01-30 09:40:15.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:16 np0005601977 nova_compute[183130]: 2026-01-30 09:40:16.262 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:17 np0005601977 ovn_controller[95460]: 2026-01-30T09:40:17Z|00438|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 30 04:40:17 np0005601977 nova_compute[183130]: 2026-01-30 09:40:17.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:17 np0005601977 nova_compute[183130]: 2026-01-30 09:40:17.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:40:17 np0005601977 nova_compute[183130]: 2026-01-30 09:40:17.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:40:17 np0005601977 nova_compute[183130]: 2026-01-30 09:40:17.402 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:40:17 np0005601977 nova_compute[183130]: 2026-01-30 09:40:17.403 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:17 np0005601977 nova_compute[183130]: 2026-01-30 09:40:17.403 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:40:20 np0005601977 nova_compute[183130]: 2026-01-30 09:40:20.135 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:20 np0005601977 nova_compute[183130]: 2026-01-30 09:40:20.398 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:20 np0005601977 podman[224595]: 2026-01-30 09:40:20.841246162 +0000 UTC m=+0.056582577 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:40:21 np0005601977 nova_compute[183130]: 2026-01-30 09:40:21.295 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:25 np0005601977 nova_compute[183130]: 2026-01-30 09:40:25.135 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:26 np0005601977 nova_compute[183130]: 2026-01-30 09:40:26.351 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:30 np0005601977 nova_compute[183130]: 2026-01-30 09:40:30.137 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:30 np0005601977 podman[224619]: 2026-01-30 09:40:30.850161859 +0000 UTC m=+0.069040031 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.7, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Jan 30 04:40:30 np0005601977 podman[224620]: 2026-01-30 09:40:30.864488815 +0000 UTC m=+0.076108791 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 30 04:40:31 np0005601977 nova_compute[183130]: 2026-01-30 09:40:31.385 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:35 np0005601977 nova_compute[183130]: 2026-01-30 09:40:35.139 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:36 np0005601977 nova_compute[183130]: 2026-01-30 09:40:36.422 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:36 np0005601977 podman[224663]: 2026-01-30 09:40:36.850920525 +0000 UTC m=+0.060068447 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:40:36 np0005601977 podman[224662]: 2026-01-30 09:40:36.851276595 +0000 UTC m=+0.064170083 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 30 04:40:40 np0005601977 nova_compute[183130]: 2026-01-30 09:40:40.140 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:41 np0005601977 nova_compute[183130]: 2026-01-30 09:40:41.423 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:41 np0005601977 podman[224704]: 2026-01-30 09:40:41.849405234 +0000 UTC m=+0.069180154 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:40:45 np0005601977 nova_compute[183130]: 2026-01-30 09:40:45.141 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:46 np0005601977 nova_compute[183130]: 2026-01-30 09:40:46.500 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:50 np0005601977 nova_compute[183130]: 2026-01-30 09:40:50.182 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:51 np0005601977 nova_compute[183130]: 2026-01-30 09:40:51.514 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:51 np0005601977 podman[224730]: 2026-01-30 09:40:51.856982454 +0000 UTC m=+0.047329524 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:40:55 np0005601977 nova_compute[183130]: 2026-01-30 09:40:55.186 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:56 np0005601977 nova_compute[183130]: 2026-01-30 09:40:56.517 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:40:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:40:57.394 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:40:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:40:57.394 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:40:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:40:57.394 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:41:00 np0005601977 nova_compute[183130]: 2026-01-30 09:41:00.218 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:01 np0005601977 nova_compute[183130]: 2026-01-30 09:41:01.518 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:01 np0005601977 podman[224754]: 2026-01-30 09:41:01.834030868 +0000 UTC m=+0.051539554 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, build-date=2026-01-22T05:09:47Z, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:41:01 np0005601977 podman[224755]: 2026-01-30 09:41:01.83692872 +0000 UTC m=+0.051172804 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:41:05 np0005601977 nova_compute[183130]: 2026-01-30 09:41:05.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:06 np0005601977 nova_compute[183130]: 2026-01-30 09:41:06.519 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.362 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.362 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.362 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.362 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.512 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.515 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5693MB free_disk=73.24970626831055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.515 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.516 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.572 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.573 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.595 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.609 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.610 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:41:07 np0005601977 nova_compute[183130]: 2026-01-30 09:41:07.611 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:41:07 np0005601977 podman[224796]: 2026-01-30 09:41:07.837263283 +0000 UTC m=+0.057403021 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:41:07 np0005601977 podman[224797]: 2026-01-30 09:41:07.850976042 +0000 UTC m=+0.064867853 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:41:10 np0005601977 nova_compute[183130]: 2026-01-30 09:41:10.298 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:11 np0005601977 nova_compute[183130]: 2026-01-30 09:41:11.521 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:12 np0005601977 podman[224840]: 2026-01-30 09:41:12.884069704 +0000 UTC m=+0.106488534 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:41:13 np0005601977 nova_compute[183130]: 2026-01-30 09:41:13.611 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:14 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:41:14.421 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:41:14 np0005601977 nova_compute[183130]: 2026-01-30 09:41:14.421 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:14 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:41:14.423 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:41:15 np0005601977 nova_compute[183130]: 2026-01-30 09:41:15.331 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:15 np0005601977 nova_compute[183130]: 2026-01-30 09:41:15.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:15 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:41:15.424 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:41:16 np0005601977 nova_compute[183130]: 2026-01-30 09:41:16.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:16 np0005601977 nova_compute[183130]: 2026-01-30 09:41:16.552 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:17 np0005601977 nova_compute[183130]: 2026-01-30 09:41:17.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:17 np0005601977 nova_compute[183130]: 2026-01-30 09:41:17.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:41:17 np0005601977 nova_compute[183130]: 2026-01-30 09:41:17.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:41:17 np0005601977 nova_compute[183130]: 2026-01-30 09:41:17.360 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:41:17 np0005601977 nova_compute[183130]: 2026-01-30 09:41:17.360 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:17 np0005601977 nova_compute[183130]: 2026-01-30 09:41:17.360 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:18 np0005601977 nova_compute[183130]: 2026-01-30 09:41:18.356 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:19 np0005601977 nova_compute[183130]: 2026-01-30 09:41:19.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:19 np0005601977 nova_compute[183130]: 2026-01-30 09:41:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:41:20 np0005601977 nova_compute[183130]: 2026-01-30 09:41:20.335 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:21 np0005601977 nova_compute[183130]: 2026-01-30 09:41:21.558 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:22 np0005601977 nova_compute[183130]: 2026-01-30 09:41:22.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:22 np0005601977 podman[224867]: 2026-01-30 09:41:22.846970287 +0000 UTC m=+0.054552540 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:41:25 np0005601977 nova_compute[183130]: 2026-01-30 09:41:25.382 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:26 np0005601977 nova_compute[183130]: 2026-01-30 09:41:26.586 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:30 np0005601977 nova_compute[183130]: 2026-01-30 09:41:30.383 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:31 np0005601977 nova_compute[183130]: 2026-01-30 09:41:31.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:32 np0005601977 podman[224892]: 2026-01-30 09:41:32.855383082 +0000 UTC m=+0.072248712 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:41:32 np0005601977 podman[224891]: 2026-01-30 09:41:32.856703749 +0000 UTC m=+0.076416070 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1769056855, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z)
Jan 30 04:41:35 np0005601977 nova_compute[183130]: 2026-01-30 09:41:35.430 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:36 np0005601977 nova_compute[183130]: 2026-01-30 09:41:36.653 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:38 np0005601977 podman[224934]: 2026-01-30 09:41:38.831061094 +0000 UTC m=+0.042750205 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:41:38 np0005601977 podman[224933]: 2026-01-30 09:41:38.841911412 +0000 UTC m=+0.058004298 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:41:40 np0005601977 nova_compute[183130]: 2026-01-30 09:41:40.481 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:41 np0005601977 nova_compute[183130]: 2026-01-30 09:41:41.693 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:43 np0005601977 podman[224976]: 2026-01-30 09:41:43.865509855 +0000 UTC m=+0.078754127 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 30 04:41:45 np0005601977 ovn_controller[95460]: 2026-01-30T09:41:45Z|00439|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 30 04:41:45 np0005601977 nova_compute[183130]: 2026-01-30 09:41:45.530 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:46 np0005601977 nova_compute[183130]: 2026-01-30 09:41:46.733 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:50 np0005601977 nova_compute[183130]: 2026-01-30 09:41:50.534 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:51 np0005601977 nova_compute[183130]: 2026-01-30 09:41:51.737 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:53 np0005601977 podman[225002]: 2026-01-30 09:41:53.835042783 +0000 UTC m=+0.055850906 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:41:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601977 nova_compute[183130]: 2026-01-30 09:41:55.565 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:56 np0005601977 nova_compute[183130]: 2026-01-30 09:41:56.777 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:41:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:41:57.394 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:41:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:41:57.395 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:41:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:41:57.395 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:00 np0005601977 nova_compute[183130]: 2026-01-30 09:42:00.569 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:01 np0005601977 nova_compute[183130]: 2026-01-30 09:42:01.778 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:03 np0005601977 podman[225028]: 2026-01-30 09:42:03.836647634 +0000 UTC m=+0.050751271 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, release=1769056855, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:42:03 np0005601977 podman[225029]: 2026-01-30 09:42:03.871523074 +0000 UTC m=+0.081584176 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:42:05 np0005601977 nova_compute[183130]: 2026-01-30 09:42:05.620 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:06 np0005601977 nova_compute[183130]: 2026-01-30 09:42:06.783 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.378 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.378 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.538 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.539 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.24971389770508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.539 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.539 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.597 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.598 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.618 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.635 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.637 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:42:08 np0005601977 nova_compute[183130]: 2026-01-30 09:42:08.637 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:42:09 np0005601977 podman[225069]: 2026-01-30 09:42:09.858576121 +0000 UTC m=+0.064846612 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:42:09 np0005601977 podman[225068]: 2026-01-30 09:42:09.862703538 +0000 UTC m=+0.073221690 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:42:10 np0005601977 nova_compute[183130]: 2026-01-30 09:42:10.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:11 np0005601977 nova_compute[183130]: 2026-01-30 09:42:11.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:11 np0005601977 nova_compute[183130]: 2026-01-30 09:42:11.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 30 04:42:11 np0005601977 nova_compute[183130]: 2026-01-30 09:42:11.420 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:11 np0005601977 nova_compute[183130]: 2026-01-30 09:42:11.782 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:14 np0005601977 nova_compute[183130]: 2026-01-30 09:42:14.431 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:14 np0005601977 podman[225109]: 2026-01-30 09:42:14.909968841 +0000 UTC m=+0.123389114 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:42:15 np0005601977 nova_compute[183130]: 2026-01-30 09:42:15.664 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:16 np0005601977 nova_compute[183130]: 2026-01-30 09:42:16.839 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:17 np0005601977 nova_compute[183130]: 2026-01-30 09:42:17.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:18 np0005601977 nova_compute[183130]: 2026-01-30 09:42:18.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:18 np0005601977 nova_compute[183130]: 2026-01-30 09:42:18.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:42:18 np0005601977 nova_compute[183130]: 2026-01-30 09:42:18.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:42:18 np0005601977 nova_compute[183130]: 2026-01-30 09:42:18.360 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:42:18 np0005601977 nova_compute[183130]: 2026-01-30 09:42:18.360 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:18 np0005601977 nova_compute[183130]: 2026-01-30 09:42:18.361 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:19 np0005601977 nova_compute[183130]: 2026-01-30 09:42:19.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:19 np0005601977 nova_compute[183130]: 2026-01-30 09:42:19.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:19 np0005601977 nova_compute[183130]: 2026-01-30 09:42:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 30 04:42:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:19.608 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:42:19 np0005601977 nova_compute[183130]: 2026-01-30 09:42:19.609 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:19.610 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 30 04:42:20 np0005601977 nova_compute[183130]: 2026-01-30 09:42:20.666 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:21 np0005601977 nova_compute[183130]: 2026-01-30 09:42:21.842 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:42:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:22.612 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:42:23 np0005601977 nova_compute[183130]: 2026-01-30 09:42:23.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.613 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.614 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.637 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.723 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.723 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.732 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.732 183134 INFO nova.compute.claims [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Claim successful on node compute-0.ctlplane.example.com
Jan 30 04:42:24 np0005601977 podman[225137]: 2026-01-30 09:42:24.83855205 +0000 UTC m=+0.057651108 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.841 183134 DEBUG nova.compute.provider_tree [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.858 183134 DEBUG nova.scheduler.client.report [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.880 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.881 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.926 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.927 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.951 183134 INFO nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 30 04:42:24 np0005601977 nova_compute[183130]: 2026-01-30 09:42:24.969 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.057 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.058 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.059 183134 INFO nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Creating image(s)
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.059 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.059 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.060 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.076 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.146 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.147 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.148 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.172 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.247 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.248 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.291 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.292 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.293 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.325 183134 DEBUG nova.policy [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.363 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.364 183134 DEBUG nova.virt.disk.api [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.364 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.420 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.421 183134 DEBUG nova.virt.disk.api [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.422 183134 DEBUG nova.objects.instance [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid b072fab9-d7da-4c12-927b-098cedc02d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.444 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.444 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Ensure instance console log exists: /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.445 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.445 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.446 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:25 np0005601977 nova_compute[183130]: 2026-01-30 09:42:25.670 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:26 np0005601977 nova_compute[183130]: 2026-01-30 09:42:26.681 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Successfully created port: e0f76f69-c08f-4fe7-8510-242c083536a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:42:26 np0005601977 nova_compute[183130]: 2026-01-30 09:42:26.844 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:27 np0005601977 nova_compute[183130]: 2026-01-30 09:42:27.894 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Successfully created port: c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:42:28 np0005601977 nova_compute[183130]: 2026-01-30 09:42:28.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:42:28 np0005601977 nova_compute[183130]: 2026-01-30 09:42:28.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:42:28 np0005601977 nova_compute[183130]: 2026-01-30 09:42:28.382 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:42:29 np0005601977 nova_compute[183130]: 2026-01-30 09:42:29.368 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Successfully updated port: e0f76f69-c08f-4fe7-8510-242c083536a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:42:30 np0005601977 nova_compute[183130]: 2026-01-30 09:42:30.654 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Successfully updated port: c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:42:30 np0005601977 nova_compute[183130]: 2026-01-30 09:42:30.672 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:30 np0005601977 nova_compute[183130]: 2026-01-30 09:42:30.680 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:42:30 np0005601977 nova_compute[183130]: 2026-01-30 09:42:30.681 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:42:30 np0005601977 nova_compute[183130]: 2026-01-30 09:42:30.681 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:42:31 np0005601977 nova_compute[183130]: 2026-01-30 09:42:31.383 183134 DEBUG nova.compute.manager [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-changed-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:31 np0005601977 nova_compute[183130]: 2026-01-30 09:42:31.384 183134 DEBUG nova.compute.manager [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing instance network info cache due to event network-changed-e0f76f69-c08f-4fe7-8510-242c083536a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:42:31 np0005601977 nova_compute[183130]: 2026-01-30 09:42:31.384 183134 DEBUG oslo_concurrency.lockutils [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:42:31 np0005601977 nova_compute[183130]: 2026-01-30 09:42:31.845 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:32 np0005601977 nova_compute[183130]: 2026-01-30 09:42:32.306 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:42:34 np0005601977 podman[225177]: 2026-01-30 09:42:34.824028273 +0000 UTC m=+0.047215961 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 30 04:42:34 np0005601977 podman[225176]: 2026-01-30 09:42:34.828257343 +0000 UTC m=+0.051862102 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7)
Jan 30 04:42:35 np0005601977 nova_compute[183130]: 2026-01-30 09:42:35.677 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:36 np0005601977 nova_compute[183130]: 2026-01-30 09:42:36.886 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.817 183134 DEBUG nova.network.neutron [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": 
"2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.893 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.894 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Instance network_info: |[{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.895 183134 DEBUG oslo_concurrency.lockutils [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.895 183134 DEBUG nova.network.neutron [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing network info cache for port e0f76f69-c08f-4fe7-8510-242c083536a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.901 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Start _get_guest_xml network_info=[{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.907 183134 WARNING nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.913 183134 DEBUG nova.virt.libvirt.host [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.913 183134 DEBUG nova.virt.libvirt.host [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.917 183134 DEBUG nova.virt.libvirt.host [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.917 183134 DEBUG nova.virt.libvirt.host [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.918 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.918 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.919 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.919 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.919 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.919 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.919 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.920 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.920 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.920 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.920 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.921 183134 DEBUG nova.virt.hardware [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.923 183134 DEBUG nova.virt.libvirt.vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:42:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1962787323',display_name='tempest-TestGettingAddress-server-1962787323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1962787323',id=43,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGQN5IaDi2xGyVjAX/ZPyfqf80B/mZpi5CvkyF/9kU8ksDUARQSriRKqgWtI0cmvP25gjZev1kjQJo8d0kxPvhjTOgpMuecvWSbrinqi+RCatLf0uBBen7DaqDcX6df2g==',key_name='tempest-TestGettingAddress-666009288',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-upx0hesa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:42:25Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b072fab9-d7da-4c12-927b-098cedc02d8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.923 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.924 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.925 183134 DEBUG nova.virt.libvirt.vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:42:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1962787323',display_name='tempest-TestGettingAddress-server-1962787323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1962787323',id=43,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGQN5IaDi2xGyVjAX/ZPyfqf80B/mZpi5CvkyF/9kU8ksDUARQSriRKqgWtI0cmvP25gjZev1kjQJo8d0kxPvhjTOgpMuecvWSbrinqi+RCatLf0uBBen7DaqDcX6df2g==',key_name='tempest-TestGettingAddress-666009288',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-upx0hesa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:42:25Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b072fab9-d7da-4c12-927b-098cedc02d8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.925 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.926 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.927 183134 DEBUG nova.objects.instance [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid b072fab9-d7da-4c12-927b-098cedc02d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.942 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <uuid>b072fab9-d7da-4c12-927b-098cedc02d8c</uuid>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <name>instance-0000002b</name>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-1962787323</nova:name>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:42:37</nova:creationTime>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:port uuid="e0f76f69-c08f-4fe7-8510-242c083536a3">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        <nova:port uuid="c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe73:2e70" ipVersion="6"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe73:2e70" ipVersion="6"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <entry name="serial">b072fab9-d7da-4c12-927b-098cedc02d8c</entry>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <entry name="uuid">b072fab9-d7da-4c12-927b-098cedc02d8c</entry>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.config"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:ef:a3:7b"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <target dev="tape0f76f69-c0"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:73:2e:70"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <target dev="tapc7fc2fe7-f2"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/console.log" append="off"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:42:37 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:42:37 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:42:37 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:42:37 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.943 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Preparing to wait for external event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.943 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.943 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.943 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.943 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Preparing to wait for external event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.943 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.944 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.944 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.944 183134 DEBUG nova.virt.libvirt.vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:42:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1962787323',display_name='tempest-TestGettingAddress-server-1962787323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1962787323',id=43,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGQN5IaDi2xGyVjAX/ZPyfqf80B/mZpi5CvkyF/9kU8ksDUARQSriRKqgWtI0cmvP25gjZev1kjQJo8d0kxPvhjTOgpMuecvWSbrinqi+RCatLf0uBBen7DaqDcX6df2g==',key_name='tempest-TestGettingAddress-666009288',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-upx0hesa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:42:25Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b072fab9-d7da-4c12-927b-098cedc02d8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.945 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.945 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.946 183134 DEBUG os_vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.946 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.946 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.947 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.949 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.949 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0f76f69-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.949 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape0f76f69-c0, col_values=(('external_ids', {'iface-id': 'e0f76f69-c08f-4fe7-8510-242c083536a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:a3:7b', 'vm-uuid': 'b072fab9-d7da-4c12-927b-098cedc02d8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.985 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 NetworkManager[55565]: <info>  [1769766157.9879] manager: (tape0f76f69-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.989 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.993 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.994 183134 INFO os_vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0')#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.995 183134 DEBUG nova.virt.libvirt.vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:42:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1962787323',display_name='tempest-TestGettingAddress-server-1962787323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1962787323',id=43,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGQN5IaDi2xGyVjAX/ZPyfqf80B/mZpi5CvkyF/9kU8ksDUARQSriRKqgWtI0cmvP25gjZev1kjQJo8d0kxPvhjTOgpMuecvWSbrinqi+RCatLf0uBBen7DaqDcX6df2g==',key_name='tempest-TestGettingAddress-666009288',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-upx0hesa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:42:25Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b072fab9-d7da-4c12-927b-098cedc02d8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.995 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.996 183134 DEBUG nova.network.os_vif_util [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.996 183134 DEBUG os_vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.996 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.997 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.997 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.998 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.998 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7fc2fe7-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:37 np0005601977 nova_compute[183130]: 2026-01-30 09:42:37.999 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc7fc2fe7-f2, col_values=(('external_ids', {'iface-id': 'c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:2e:70', 'vm-uuid': 'b072fab9-d7da-4c12-927b-098cedc02d8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.000 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:38 np0005601977 NetworkManager[55565]: <info>  [1769766158.0020] manager: (tapc7fc2fe7-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.003 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.008 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.010 183134 INFO os_vif [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2')#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.082 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.082 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.082 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:ef:a3:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.083 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:73:2e:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:42:38 np0005601977 nova_compute[183130]: 2026-01-30 09:42:38.083 183134 INFO nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Using config drive#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.307 183134 INFO nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Creating config drive at /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.config#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.311 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7destkzb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.433 183134 DEBUG oslo_concurrency.processutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7destkzb" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.4974] manager: (tape0f76f69-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Jan 30 04:42:39 np0005601977 kernel: tape0f76f69-c0: entered promiscuous mode
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00440|binding|INFO|Claiming lport e0f76f69-c08f-4fe7-8510-242c083536a3 for this chassis.
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.504 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00441|binding|INFO|e0f76f69-c08f-4fe7-8510-242c083536a3: Claiming fa:16:3e:ef:a3:7b 10.100.0.14
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5134] manager: (tapc7fc2fe7-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Jan 30 04:42:39 np0005601977 kernel: tapc7fc2fe7-f2: entered promiscuous mode
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.518 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00442|if_status|INFO|Not updating pb chassis for c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 now as sb is readonly
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5259] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.524 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5267] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.533 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:a3:7b 10.100.0.14'], port_security=['fa:16:3e:ef:a3:7b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b072fab9-d7da-4c12-927b-098cedc02d8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2b94f91-8f9f-4983-a28c-8751f44f795a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e039d6c8-7c3a-4772-a3a9-336e22751ea8, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=e0f76f69-c08f-4fe7-8510-242c083536a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.534 104706 INFO neutron.agent.ovn.metadata.agent [-] Port e0f76f69-c08f-4fe7-8510-242c083536a3 in datapath 0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 bound to our chassis#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.535 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3#033[00m
Jan 30 04:42:39 np0005601977 systemd-udevd[225244]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:42:39 np0005601977 systemd-udevd[225245]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.545 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[29d8edcc-5ea5-4d55-a03e-890758550e98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.547 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0dfc6b0c-b1 in ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5489] device (tapc7fc2fe7-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5497] device (tapc7fc2fe7-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5514] device (tape0f76f69-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.551 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0dfc6b0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.551 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8cfcf832-1435-45d9-915d-dcaf83eb3b22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.5520] device (tape0f76f69-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.552 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[69fa84ed-39e0-48fb-82df-29781f0c06df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 systemd-machined[154431]: New machine qemu-36-instance-0000002b.
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.564 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[19aed956-ca7e-41c6-b7a6-2b779f4e140f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 systemd[1]: Started Virtual Machine qemu-36-instance-0000002b.
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.568 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00443|binding|INFO|Claiming lport c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 for this chassis.
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00444|binding|INFO|c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951: Claiming fa:16:3e:73:2e:70 2001:db8:0:1:f816:3eff:fe73:2e70 2001:db8::f816:3eff:fe73:2e70
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.578 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0f862f69-ee57-4d72-a53a-ccd1f50ee0db]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00445|binding|INFO|Setting lport e0f76f69-c08f-4fe7-8510-242c083536a3 ovn-installed in OVS
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.581 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00446|binding|INFO|Setting lport e0f76f69-c08f-4fe7-8510-242c083536a3 up in Southbound
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00447|binding|INFO|Setting lport c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 ovn-installed in OVS
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.587 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:2e:70 2001:db8:0:1:f816:3eff:fe73:2e70 2001:db8::f816:3eff:fe73:2e70'], port_security=['fa:16:3e:73:2e:70 2001:db8:0:1:f816:3eff:fe73:2e70 2001:db8::f816:3eff:fe73:2e70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe73:2e70/64 2001:db8::f816:3eff:fe73:2e70/64', 'neutron:device_id': 'b072fab9-d7da-4c12-927b-098cedc02d8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2b94f91-8f9f-4983-a28c-8751f44f795a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28f4e804-b114-47c5-adeb-b6f124a69855, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.588 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00448|binding|INFO|Setting lport c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 up in Southbound
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.601 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0495e101-ffb9-4ff9-93f0-7479ab2cfdc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.606 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[28377cae-561b-496d-8f51-d88e12a73d28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.6072] manager: (tap0dfc6b0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/184)
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.627 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b4237e31-9c3d-42cb-85e4-1d905e7ebdce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.630 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[eba5a123-3ccc-409a-94bc-288c94a440b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.6457] device (tap0dfc6b0c-b0): carrier: link connected
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.647 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5988184f-da36-49c1-a69e-85e6e28bb236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.659 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3502d7-7752-48c7-9ab2-e14983998967]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0dfc6b0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:0f:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469098, 'reachable_time': 29316, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225279, 'error': None, 'target': 'ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.671 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[660aa875-cff0-4d50-afb8-89b012b0cdec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:fdb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469098, 'tstamp': 469098}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225280, 'error': None, 'target': 'ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.685 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ec10ce12-abd8-45ab-b307-50af76556bbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0dfc6b0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:0f:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469098, 'reachable_time': 29316, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225281, 'error': None, 'target': 'ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.706 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1395a06a-35d6-45b6-bd55-274d82bc2a6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.742 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[86877c94-80d9-40d3-bb11-9aa8d0ed248c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.743 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0dfc6b0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.744 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.744 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0dfc6b0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.746 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 NetworkManager[55565]: <info>  [1769766159.7469] manager: (tap0dfc6b0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 30 04:42:39 np0005601977 kernel: tap0dfc6b0c-b0: entered promiscuous mode
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.748 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.751 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0dfc6b0c-b0, col_values=(('external_ids', {'iface-id': '0586c817-2aea-4d37-9732-33597fb96cb6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.752 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:39Z|00449|binding|INFO|Releasing lport 0586c817-2aea-4d37-9732-33597fb96cb6 from this chassis (sb_readonly=0)
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.753 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.755 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:42:39 np0005601977 nova_compute[183130]: 2026-01-30 09:42:39.756 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.756 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4145c99e-a545-48d3-bd7a-de5457306dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.757 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3.pid.haproxy
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:42:39 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:39.758 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'env', 'PROCESS_TAG=haproxy-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.023 183134 DEBUG nova.compute.manager [req-4f501d30-fa32-43f1-9a70-35a4083fc2ed req-5f918882-48f4-4711-9556-9f74d82e37f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.024 183134 DEBUG oslo_concurrency.lockutils [req-4f501d30-fa32-43f1-9a70-35a4083fc2ed req-5f918882-48f4-4711-9556-9f74d82e37f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.024 183134 DEBUG oslo_concurrency.lockutils [req-4f501d30-fa32-43f1-9a70-35a4083fc2ed req-5f918882-48f4-4711-9556-9f74d82e37f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.025 183134 DEBUG oslo_concurrency.lockutils [req-4f501d30-fa32-43f1-9a70-35a4083fc2ed req-5f918882-48f4-4711-9556-9f74d82e37f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.025 183134 DEBUG nova.compute.manager [req-4f501d30-fa32-43f1-9a70-35a4083fc2ed req-5f918882-48f4-4711-9556-9f74d82e37f9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Processing event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:42:40 np0005601977 podman[225313]: 2026-01-30 09:42:40.049114037 +0000 UTC m=+0.043140766 container create 4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:42:40 np0005601977 systemd[1]: Started libpod-conmon-4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348.scope.
Jan 30 04:42:40 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:42:40 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1609eae4357640eb8c01be1d584b8719d9abc73f2ccffe3816a9a3225f081fb6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:42:40 np0005601977 podman[225313]: 2026-01-30 09:42:40.120785331 +0000 UTC m=+0.114812080 container init 4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:42:40 np0005601977 podman[225313]: 2026-01-30 09:42:40.024053585 +0000 UTC m=+0.018080334 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:42:40 np0005601977 podman[225313]: 2026-01-30 09:42:40.131802384 +0000 UTC m=+0.125829113 container start 4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 30 04:42:40 np0005601977 podman[225329]: 2026-01-30 09:42:40.139026509 +0000 UTC m=+0.053611553 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:42:40 np0005601977 podman[225326]: 2026-01-30 09:42:40.154364985 +0000 UTC m=+0.066407137 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:42:40 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [NOTICE]   (225367) : New worker (225376) forked
Jan 30 04:42:40 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [NOTICE]   (225367) : Loading success.
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.176 104706 INFO neutron.agent.ovn.metadata.agent [-] Port c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 in datapath 925994ed-d8a1-422b-a2d5-57ed39eb5751 unbound from our chassis#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.179 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 925994ed-d8a1-422b-a2d5-57ed39eb5751#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.187 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[49c148ff-7cc5-40f4-8f18-1225f4fede4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.188 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap925994ed-d1 in ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.190 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap925994ed-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.190 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8f7f728f-bcd7-4fcc-bda1-65fe50e357f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.190 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9d558efe-96ee-4f87-beb8-8ba76cd33f75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.197 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7791ce-ebbc-43a5-92ec-ad31cdb0a25f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.218 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9d500b2f-f148-4093-8407-5000d19c469a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.246 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b56e59-86b6-4f69-a59f-8a106c293ff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.251 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f65eed-15a5-4805-b046-723161ed5bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 systemd-udevd[225275]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:42:40 np0005601977 NetworkManager[55565]: <info>  [1769766160.2531] manager: (tap925994ed-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.267 183134 DEBUG nova.network.neutron [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updated VIF entry in instance network info cache for port e0f76f69-c08f-4fe7-8510-242c083536a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.268 183134 DEBUG nova.network.neutron [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": 
[], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.270 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766160.2702963, b072fab9-d7da-4c12-927b-098cedc02d8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.271 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] VM Started (Lifecycle Event)#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.279 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ba295661-5cac-4b73-a252-992cd015c823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.283 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[eb38f58f-7b89-4ef0-8a0e-72e7fd9fee4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 NetworkManager[55565]: <info>  [1769766160.3027] device (tap925994ed-d0): carrier: link connected
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.306 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[3426d0e5-59d5-498f-958b-66f7870902ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.312 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.316 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766160.2711923, b072fab9-d7da-4c12-927b-098cedc02d8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.316 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.319 183134 DEBUG oslo_concurrency.lockutils [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.319 183134 DEBUG nova.compute.manager [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-changed-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.319 183134 DEBUG nova.compute.manager [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing instance network info cache due to event network-changed-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.319 183134 DEBUG oslo_concurrency.lockutils [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.319 183134 DEBUG oslo_concurrency.lockutils [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.320 183134 DEBUG nova.network.neutron [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing network info cache for port c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.323 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7f0bc3-66dd-4302-ba64-866c68038fe0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap925994ed-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:a8:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469164, 'reachable_time': 17230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225404, 'error': None, 'target': 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.335 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe3788b-6968-4d5d-b5bf-12ad09880674]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:a8ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469164, 'tstamp': 469164}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225405, 'error': None, 'target': 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.350 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.350 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9f7022-573e-4115-bc60-c260d0f4e7ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap925994ed-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:a8:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469164, 'reachable_time': 17230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225406, 'error': None, 'target': 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.355 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.375 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c44fc7cd-6f38-4f94-a4d9-bfd44f6144df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.388 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.404 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad046f0-27d2-4092-aec3-dc260a9674a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.405 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap925994ed-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.406 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.406 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap925994ed-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.464 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:40 np0005601977 kernel: tap925994ed-d0: entered promiscuous mode
Jan 30 04:42:40 np0005601977 NetworkManager[55565]: <info>  [1769766160.4663] manager: (tap925994ed-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.466 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.467 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap925994ed-d0, col_values=(('external_ids', {'iface-id': 'a360118c-35c8-4b21-89e2-7d8f63dcd10b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.468 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.469 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:40Z|00450|binding|INFO|Releasing lport a360118c-35c8-4b21-89e2-7d8f63dcd10b from this chassis (sb_readonly=0)
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.469 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/925994ed-d8a1-422b-a2d5-57ed39eb5751.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/925994ed-d8a1-422b-a2d5-57ed39eb5751.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.470 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[273ea530-e4d8-462f-8776-edf135f4c177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.471 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-925994ed-d8a1-422b-a2d5-57ed39eb5751
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/925994ed-d8a1-422b-a2d5-57ed39eb5751.pid.haproxy
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 925994ed-d8a1-422b-a2d5-57ed39eb5751
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:42:40 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:40.472 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'env', 'PROCESS_TAG=haproxy-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/925994ed-d8a1-422b-a2d5-57ed39eb5751.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:42:40 np0005601977 nova_compute[183130]: 2026-01-30 09:42:40.474 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:40 np0005601977 podman[225436]: 2026-01-30 09:42:40.844063276 +0000 UTC m=+0.041583762 container create 9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:42:40 np0005601977 systemd[1]: Started libpod-conmon-9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985.scope.
Jan 30 04:42:40 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:42:40 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b760ecdfa8e17fd1330ef9e5474e3711b38506db8f8427305908685b90d3958/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:42:40 np0005601977 podman[225436]: 2026-01-30 09:42:40.900871369 +0000 UTC m=+0.098391875 container init 9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 30 04:42:40 np0005601977 podman[225436]: 2026-01-30 09:42:40.907433215 +0000 UTC m=+0.104953701 container start 9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:42:40 np0005601977 podman[225436]: 2026-01-30 09:42:40.825728225 +0000 UTC m=+0.023248731 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:42:40 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [NOTICE]   (225456) : New worker (225458) forked
Jan 30 04:42:40 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [NOTICE]   (225456) : Loading success.
Jan 30 04:42:41 np0005601977 nova_compute[183130]: 2026-01-30 09:42:41.931 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.458 183134 DEBUG nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.459 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.460 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.460 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.461 183134 DEBUG nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] No event matching network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 in dict_keys([('network-vif-plugged', 'c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.461 183134 WARNING nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received unexpected event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.462 183134 DEBUG nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.462 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.463 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.463 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.464 183134 DEBUG nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Processing event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.464 183134 DEBUG nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.465 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.465 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.466 183134 DEBUG oslo_concurrency.lockutils [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.466 183134 DEBUG nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] No waiting events found dispatching network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.467 183134 WARNING nova.compute.manager [req-820343e2-350d-4679-a7d7-05a960358693 req-17b8e63f-28d0-4b5d-ac00-b322efc17b66 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received unexpected event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 for instance with vm_state building and task_state spawning.#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.468 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.471 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766162.4713304, b072fab9-d7da-4c12-927b-098cedc02d8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.472 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.474 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.479 183134 INFO nova.virt.libvirt.driver [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Instance spawned successfully.#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.479 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.492 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.497 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.501 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.501 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.501 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.501 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.502 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.502 183134 DEBUG nova.virt.libvirt.driver [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.548 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.562 183134 INFO nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Took 17.50 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.562 183134 DEBUG nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.636 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.641 183134 INFO nova.compute.manager [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Took 17.95 seconds to build instance.#033[00m
Jan 30 04:42:42 np0005601977 nova_compute[183130]: 2026-01-30 09:42:42.665 183134 DEBUG oslo_concurrency.lockutils [None req-247db0c3-2e53-4ae9-a114-4e57d4ca898e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:43 np0005601977 nova_compute[183130]: 2026-01-30 09:42:43.043 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:43 np0005601977 nova_compute[183130]: 2026-01-30 09:42:43.352 183134 DEBUG nova.network.neutron [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updated VIF entry in instance network info cache for port c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:42:43 np0005601977 nova_compute[183130]: 2026-01-30 09:42:43.353 183134 DEBUG nova.network.neutron [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:42:43 np0005601977 nova_compute[183130]: 2026-01-30 09:42:43.376 183134 DEBUG oslo_concurrency.lockutils [req-d3174aeb-361b-4e3d-85ca-865b987de88a req-2fdc88aa-1020-429e-94be-f62b90a5d8d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:42:45 np0005601977 podman[225467]: 2026-01-30 09:42:45.852883229 +0000 UTC m=+0.073635871 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:42:46 np0005601977 nova_compute[183130]: 2026-01-30 09:42:46.781 183134 DEBUG nova.compute.manager [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-changed-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:42:46 np0005601977 nova_compute[183130]: 2026-01-30 09:42:46.781 183134 DEBUG nova.compute.manager [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing instance network info cache due to event network-changed-e0f76f69-c08f-4fe7-8510-242c083536a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:42:46 np0005601977 nova_compute[183130]: 2026-01-30 09:42:46.781 183134 DEBUG oslo_concurrency.lockutils [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:42:46 np0005601977 nova_compute[183130]: 2026-01-30 09:42:46.782 183134 DEBUG oslo_concurrency.lockutils [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:42:46 np0005601977 nova_compute[183130]: 2026-01-30 09:42:46.782 183134 DEBUG nova.network.neutron [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing network info cache for port e0f76f69-c08f-4fe7-8510-242c083536a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:42:46 np0005601977 nova_compute[183130]: 2026-01-30 09:42:46.937 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:47 np0005601977 nova_compute[183130]: 2026-01-30 09:42:47.935 183134 DEBUG nova.network.neutron [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updated VIF entry in instance network info cache for port e0f76f69-c08f-4fe7-8510-242c083536a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:42:47 np0005601977 nova_compute[183130]: 2026-01-30 09:42:47.937 183134 DEBUG nova.network.neutron [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:42:47 np0005601977 nova_compute[183130]: 2026-01-30 09:42:47.960 183134 DEBUG oslo_concurrency.lockutils [req-c8801b52-d65f-4a2a-aaa6-f9be49cde418 req-43810ff8-58c6-4baa-8fb6-55140878bc3a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:42:48 np0005601977 nova_compute[183130]: 2026-01-30 09:42:48.087 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:52 np0005601977 nova_compute[183130]: 2026-01-30 09:42:52.000 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:53 np0005601977 nova_compute[183130]: 2026-01-30 09:42:53.091 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:54Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:a3:7b 10.100.0.14
Jan 30 04:42:54 np0005601977 ovn_controller[95460]: 2026-01-30T09:42:54Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:a3:7b 10.100.0.14
Jan 30 04:42:55 np0005601977 podman[225513]: 2026-01-30 09:42:55.825798566 +0000 UTC m=+0.046423249 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:42:57 np0005601977 nova_compute[183130]: 2026-01-30 09:42:57.050 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:42:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:57.396 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:57.397 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:42:57.397 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:58 np0005601977 nova_compute[183130]: 2026-01-30 09:42:58.094 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:02 np0005601977 nova_compute[183130]: 2026-01-30 09:43:02.052 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:03 np0005601977 nova_compute[183130]: 2026-01-30 09:43:03.097 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:05 np0005601977 podman[225537]: 2026-01-30 09:43:05.850992276 +0000 UTC m=+0.066215231 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Jan 30 04:43:05 np0005601977 podman[225538]: 2026-01-30 09:43:05.855135913 +0000 UTC m=+0.066910070 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.656 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.656 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.657 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.657 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.657 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.659 183134 INFO nova.compute.manager [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Terminating instance#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.660 183134 DEBUG nova.compute.manager [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:43:06 np0005601977 kernel: tape0f76f69-c0 (unregistering): left promiscuous mode
Jan 30 04:43:06 np0005601977 NetworkManager[55565]: <info>  [1769766186.7037] device (tape0f76f69-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.739 183134 DEBUG nova.compute.manager [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-changed-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.740 183134 DEBUG nova.compute.manager [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing instance network info cache due to event network-changed-e0f76f69-c08f-4fe7-8510-242c083536a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.740 183134 DEBUG oslo_concurrency.lockutils [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.740 183134 DEBUG oslo_concurrency.lockutils [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.741 183134 DEBUG nova.network.neutron [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Refreshing network info cache for port e0f76f69-c08f-4fe7-8510-242c083536a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.744 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:06Z|00451|binding|INFO|Releasing lport e0f76f69-c08f-4fe7-8510-242c083536a3 from this chassis (sb_readonly=0)
Jan 30 04:43:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:06Z|00452|binding|INFO|Setting lport e0f76f69-c08f-4fe7-8510-242c083536a3 down in Southbound
Jan 30 04:43:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:06Z|00453|binding|INFO|Removing iface tape0f76f69-c0 ovn-installed in OVS
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.748 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.751 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 kernel: tapc7fc2fe7-f2 (unregistering): left promiscuous mode
Jan 30 04:43:06 np0005601977 NetworkManager[55565]: <info>  [1769766186.7575] device (tapc7fc2fe7-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.758 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.758 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:a3:7b 10.100.0.14'], port_security=['fa:16:3e:ef:a3:7b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b072fab9-d7da-4c12-927b-098cedc02d8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2b94f91-8f9f-4983-a28c-8751f44f795a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e039d6c8-7c3a-4772-a3a9-336e22751ea8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=e0f76f69-c08f-4fe7-8510-242c083536a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.786 104706 INFO neutron.agent.ovn.metadata.agent [-] Port e0f76f69-c08f-4fe7-8510-242c083536a3 in datapath 0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 unbound from our chassis#033[00m
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.788 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.789 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbc7ec8-b1af-4305-9ca9-1a01783f3fee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.789 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 namespace which is not needed anymore#033[00m
Jan 30 04:43:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:06Z|00454|binding|INFO|Releasing lport c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 from this chassis (sb_readonly=0)
Jan 30 04:43:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:06Z|00455|binding|INFO|Setting lport c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 down in Southbound
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.794 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:06Z|00456|binding|INFO|Removing iface tapc7fc2fe7-f2 ovn-installed in OVS
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.796 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.798 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.802 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:2e:70 2001:db8:0:1:f816:3eff:fe73:2e70 2001:db8::f816:3eff:fe73:2e70'], port_security=['fa:16:3e:73:2e:70 2001:db8:0:1:f816:3eff:fe73:2e70 2001:db8::f816:3eff:fe73:2e70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe73:2e70/64 2001:db8::f816:3eff:fe73:2e70/64', 'neutron:device_id': 'b072fab9-d7da-4c12-927b-098cedc02d8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2b94f91-8f9f-4983-a28c-8751f44f795a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28f4e804-b114-47c5-adeb-b6f124a69855, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:43:06 np0005601977 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 30 04:43:06 np0005601977 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000002b.scope: Consumed 13.376s CPU time.
Jan 30 04:43:06 np0005601977 systemd-machined[154431]: Machine qemu-36-instance-0000002b terminated.
Jan 30 04:43:06 np0005601977 NetworkManager[55565]: <info>  [1769766186.8749] manager: (tape0f76f69-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.879 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.885 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [NOTICE]   (225367) : haproxy version is 2.8.14-c23fe91
Jan 30 04:43:06 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [NOTICE]   (225367) : path to executable is /usr/sbin/haproxy
Jan 30 04:43:06 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [WARNING]  (225367) : Exiting Master process...
Jan 30 04:43:06 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [ALERT]    (225367) : Current worker (225376) exited with code 143 (Terminated)
Jan 30 04:43:06 np0005601977 neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3[225335]: [WARNING]  (225367) : All workers exited. Exiting... (0)
Jan 30 04:43:06 np0005601977 systemd[1]: libpod-4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348.scope: Deactivated successfully.
Jan 30 04:43:06 np0005601977 podman[225607]: 2026-01-30 09:43:06.910562077 +0000 UTC m=+0.049656210 container died 4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.925 183134 INFO nova.virt.libvirt.driver [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Instance destroyed successfully.#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.926 183134 DEBUG nova.objects.instance [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid b072fab9-d7da-4c12-927b-098cedc02d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:43:06 np0005601977 systemd[1]: var-lib-containers-storage-overlay-1609eae4357640eb8c01be1d584b8719d9abc73f2ccffe3816a9a3225f081fb6-merged.mount: Deactivated successfully.
Jan 30 04:43:06 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348-userdata-shm.mount: Deactivated successfully.
Jan 30 04:43:06 np0005601977 podman[225607]: 2026-01-30 09:43:06.941415423 +0000 UTC m=+0.080509556 container cleanup 4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.940 183134 DEBUG nova.virt.libvirt.vif [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:42:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1962787323',display_name='tempest-TestGettingAddress-server-1962787323',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1962787323',id=43,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGQN5IaDi2xGyVjAX/ZPyfqf80B/mZpi5CvkyF/9kU8ksDUARQSriRKqgWtI0cmvP25gjZev1kjQJo8d0kxPvhjTOgpMuecvWSbrinqi+RCatLf0uBBen7DaqDcX6df2g==',key_name='tempest-TestGettingAddress-666009288',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:42:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-upx0hesa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:42:42Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b072fab9-d7da-4c12-927b-098cedc02d8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.942 183134 DEBUG nova.network.os_vif_util [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.943 183134 DEBUG nova.network.os_vif_util [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.944 183134 DEBUG os_vif [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:43:06 np0005601977 systemd[1]: libpod-conmon-4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348.scope: Deactivated successfully.
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.947 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.948 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0f76f69-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.949 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.952 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.953 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.955 183134 INFO os_vif [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=e0f76f69-c08f-4fe7-8510-242c083536a3,network=Network(0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0f76f69-c0')#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.956 183134 DEBUG nova.virt.libvirt.vif [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:42:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1962787323',display_name='tempest-TestGettingAddress-server-1962787323',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1962787323',id=43,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGQN5IaDi2xGyVjAX/ZPyfqf80B/mZpi5CvkyF/9kU8ksDUARQSriRKqgWtI0cmvP25gjZev1kjQJo8d0kxPvhjTOgpMuecvWSbrinqi+RCatLf0uBBen7DaqDcX6df2g==',key_name='tempest-TestGettingAddress-666009288',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:42:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-upx0hesa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:42:42Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b072fab9-d7da-4c12-927b-098cedc02d8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.956 183134 DEBUG nova.network.os_vif_util [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.957 183134 DEBUG nova.network.os_vif_util [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.958 183134 DEBUG os_vif [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.959 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.959 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7fc2fe7-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.961 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.964 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.966 183134 INFO os_vif [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:2e:70,bridge_name='br-int',has_traffic_filtering=True,id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951,network=Network(925994ed-d8a1-422b-a2d5-57ed39eb5751),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7fc2fe7-f2')#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.966 183134 INFO nova.virt.libvirt.driver [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Deleting instance files /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c_del#033[00m
Jan 30 04:43:06 np0005601977 nova_compute[183130]: 2026-01-30 09:43:06.967 183134 INFO nova.virt.libvirt.driver [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Deletion of /var/lib/nova/instances/b072fab9-d7da-4c12-927b-098cedc02d8c_del complete#033[00m
Jan 30 04:43:06 np0005601977 podman[225665]: 2026-01-30 09:43:06.994306765 +0000 UTC m=+0.036696953 container remove 4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:43:06 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.998 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f469d5ac-7f60-4daa-9f0e-1500f94816e4]: (4, ('Fri Jan 30 09:43:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 (4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348)\n4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348\nFri Jan 30 09:43:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 (4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348)\n4c2b7ad83dbe46f93343e6dc17c17a6b1a392d2668076f1343f4e8f3b79e3348\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:06.999 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fac23ea2-756c-4f81-99c0-755b26614e80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.000 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0dfc6b0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.002 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:07 np0005601977 kernel: tap0dfc6b0c-b0: left promiscuous mode
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.006 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.008 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd77d84-bbb5-4da0-b9ea-f3dbfc2dbe56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.010 183134 INFO nova.compute.manager [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.011 183134 DEBUG oslo.service.loopingcall [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.011 183134 DEBUG nova.compute.manager [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.011 183134 DEBUG nova.network.neutron [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.028 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7599ef5c-f8c9-4b85-9c1b-d559e3cc1b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.029 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[44b7df6f-e70c-4c3d-80c4-0dc2ddc4c8f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.041 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d32bdefb-1d23-4643-9029-c0fee6a66c8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469093, 'reachable_time': 27982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225682, 'error': None, 'target': 'ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 systemd[1]: run-netns-ovnmeta\x2d0dfc6b0c\x2dbbb3\x2d4a28\x2d8658\x2dca97aedc48d3.mount: Deactivated successfully.
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.045 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.045 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c1d10e-f891-4631-b4a4-0c4437ccb547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.047 104706 INFO neutron.agent.ovn.metadata.agent [-] Port c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 in datapath 925994ed-d8a1-422b-a2d5-57ed39eb5751 unbound from our chassis#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.048 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 925994ed-d8a1-422b-a2d5-57ed39eb5751, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.049 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[314c7162-bd1c-4758-8c3b-a28f2a7e9f05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.049 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751 namespace which is not needed anymore#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.053 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:07 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [NOTICE]   (225456) : haproxy version is 2.8.14-c23fe91
Jan 30 04:43:07 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [NOTICE]   (225456) : path to executable is /usr/sbin/haproxy
Jan 30 04:43:07 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [WARNING]  (225456) : Exiting Master process...
Jan 30 04:43:07 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [ALERT]    (225456) : Current worker (225458) exited with code 143 (Terminated)
Jan 30 04:43:07 np0005601977 neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751[225452]: [WARNING]  (225456) : All workers exited. Exiting... (0)
Jan 30 04:43:07 np0005601977 systemd[1]: libpod-9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985.scope: Deactivated successfully.
Jan 30 04:43:07 np0005601977 podman[225701]: 2026-01-30 09:43:07.154456442 +0000 UTC m=+0.040954514 container died 9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 30 04:43:07 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985-userdata-shm.mount: Deactivated successfully.
Jan 30 04:43:07 np0005601977 systemd[1]: var-lib-containers-storage-overlay-6b760ecdfa8e17fd1330ef9e5474e3711b38506db8f8427305908685b90d3958-merged.mount: Deactivated successfully.
Jan 30 04:43:07 np0005601977 podman[225701]: 2026-01-30 09:43:07.182877349 +0000 UTC m=+0.069375401 container cleanup 9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 30 04:43:07 np0005601977 systemd[1]: libpod-conmon-9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985.scope: Deactivated successfully.
Jan 30 04:43:07 np0005601977 podman[225731]: 2026-01-30 09:43:07.233104745 +0000 UTC m=+0.034042388 container remove 9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.236 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3070e5a2-112f-4ce5-8042-cf23ba88d299]: (4, ('Fri Jan 30 09:43:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751 (9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985)\n9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985\nFri Jan 30 09:43:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751 (9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985)\n9645209e9ba3d0b25eb87ec0c25cfbe853fc19c3579f28ea52f084805ae80985\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.238 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2f909c92-2711-4ec9-964b-452e58f5b475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.239 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap925994ed-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.240 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:07 np0005601977 kernel: tap925994ed-d0: left promiscuous mode
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.241 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.245 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.244 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7edf0d-6ff7-44bf-b68b-0017b1387490]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.258 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a366a91e-ea81-4e0f-ad71-1f8584505015]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.260 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d2dcfcff-7a5f-4aa5-a24b-4a31e2830792]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.271 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c2db06ac-cee9-4f86-b871-579d881af87d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469157, 'reachable_time': 27815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225746, 'error': None, 'target': 'ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.273 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-925994ed-d8a1-422b-a2d5-57ed39eb5751 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:43:07 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:07.273 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[c48d51b0-b61b-4f1e-9646-87c03786a49c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.511 183134 DEBUG nova.compute.manager [req-34efd22e-d048-46d6-b7a0-1d8b6df6e3f9 req-10b8454c-e2b5-4091-9b68-2d3c9c4d4bbb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-unplugged-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.511 183134 DEBUG oslo_concurrency.lockutils [req-34efd22e-d048-46d6-b7a0-1d8b6df6e3f9 req-10b8454c-e2b5-4091-9b68-2d3c9c4d4bbb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.512 183134 DEBUG oslo_concurrency.lockutils [req-34efd22e-d048-46d6-b7a0-1d8b6df6e3f9 req-10b8454c-e2b5-4091-9b68-2d3c9c4d4bbb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.512 183134 DEBUG oslo_concurrency.lockutils [req-34efd22e-d048-46d6-b7a0-1d8b6df6e3f9 req-10b8454c-e2b5-4091-9b68-2d3c9c4d4bbb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.513 183134 DEBUG nova.compute.manager [req-34efd22e-d048-46d6-b7a0-1d8b6df6e3f9 req-10b8454c-e2b5-4091-9b68-2d3c9c4d4bbb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] No waiting events found dispatching network-vif-unplugged-e0f76f69-c08f-4fe7-8510-242c083536a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:43:07 np0005601977 nova_compute[183130]: 2026-01-30 09:43:07.513 183134 DEBUG nova.compute.manager [req-34efd22e-d048-46d6-b7a0-1d8b6df6e3f9 req-10b8454c-e2b5-4091-9b68-2d3c9c4d4bbb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-unplugged-e0f76f69-c08f-4fe7-8510-242c083536a3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:43:07 np0005601977 systemd[1]: run-netns-ovnmeta\x2d925994ed\x2dd8a1\x2d422b\x2da2d5\x2d57ed39eb5751.mount: Deactivated successfully.
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.382 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.425 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.425 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.425 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.426 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.578 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.579 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5716MB free_disk=73.24949264526367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.579 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.580 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.710 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance b072fab9-d7da-4c12-927b-098cedc02d8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.711 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.711 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.782 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing inventories for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.841 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating ProviderTree inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.842 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.863 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing aggregate associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.877 183134 DEBUG nova.compute.manager [req-a80c451c-eab6-440b-ac8d-fca606a8b064 req-854c2d18-768d-4ce7-9d4f-fb6eb524c7c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-deleted-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.878 183134 INFO nova.compute.manager [req-a80c451c-eab6-440b-ac8d-fca606a8b064 req-854c2d18-768d-4ce7-9d4f-fb6eb524c7c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Neutron deleted interface c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.879 183134 DEBUG nova.network.neutron [req-a80c451c-eab6-440b-ac8d-fca606a8b064 req-854c2d18-768d-4ce7-9d4f-fb6eb524c7c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.893 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing trait associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, traits: HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.908 183134 DEBUG nova.compute.manager [req-a80c451c-eab6-440b-ac8d-fca606a8b064 req-854c2d18-768d-4ce7-9d4f-fb6eb524c7c5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Detach interface failed, port_id=c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951, reason: Instance b072fab9-d7da-4c12-927b-098cedc02d8c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.934 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.950 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.973 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:43:08 np0005601977 nova_compute[183130]: 2026-01-30 09:43:08.973 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.600 183134 DEBUG nova.network.neutron [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updated VIF entry in instance network info cache for port e0f76f69-c08f-4fe7-8510-242c083536a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.601 183134 DEBUG nova.network.neutron [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [{"id": "e0f76f69-c08f-4fe7-8510-242c083536a3", "address": "fa:16:3e:ef:a3:7b", "network": {"id": "0dfc6b0c-bbb3-4a28-8658-ca97aedc48d3", "bridge": "br-int", "label": "tempest-network-smoke--803137985", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0f76f69-c0", "ovs_interfaceid": "e0f76f69-c08f-4fe7-8510-242c083536a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "address": "fa:16:3e:73:2e:70", "network": {"id": "925994ed-d8a1-422b-a2d5-57ed39eb5751", "bridge": "br-int", "label": "tempest-network-smoke--2097086675", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": 
[], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:2e70", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7fc2fe7-f2", "ovs_interfaceid": "c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.626 183134 DEBUG oslo_concurrency.lockutils [req-0f70eb37-32ad-42e4-8ea2-d6ebe34bae02 req-afacdfa7-47e0-43f4-8db6-d9cc8fc112d5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b072fab9-d7da-4c12-927b-098cedc02d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.662 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.662 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.663 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.663 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.663 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] No waiting events found dispatching network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.663 183134 WARNING nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received unexpected event network-vif-plugged-e0f76f69-c08f-4fe7-8510-242c083536a3 for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.663 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-unplugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.663 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.664 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.664 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.664 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] No waiting events found dispatching network-vif-unplugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.664 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-unplugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.664 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.665 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.665 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.665 183134 DEBUG oslo_concurrency.lockutils [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.665 183134 DEBUG nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] No waiting events found dispatching network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.665 183134 WARNING nova.compute.manager [req-8ae083e2-419a-4ff4-9871-651fd19ba70e req-bd3da060-4b2f-4a0c-9a01-75e16601df83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received unexpected event network-vif-plugged-c7fc2fe7-f24e-48bb-ad76-bf6b38ccf951 for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.666 183134 DEBUG nova.network.neutron [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.682 183134 INFO nova.compute.manager [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Took 2.67 seconds to deallocate network for instance.#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.718 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.718 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.758 183134 DEBUG nova.compute.provider_tree [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.782 183134 DEBUG nova.scheduler.client.report [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.809 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.836 183134 INFO nova.scheduler.client.report [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance b072fab9-d7da-4c12-927b-098cedc02d8c#033[00m
Jan 30 04:43:09 np0005601977 nova_compute[183130]: 2026-01-30 09:43:09.896 183134 DEBUG oslo_concurrency.lockutils [None req-fdc64a6e-b972-43bb-905b-0d1741400ec4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b072fab9-d7da-4c12-927b-098cedc02d8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:10 np0005601977 podman[225748]: 2026-01-30 09:43:10.84787233 +0000 UTC m=+0.057320808 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:43:10 np0005601977 podman[225749]: 2026-01-30 09:43:10.851332118 +0000 UTC m=+0.057400890 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:43:10 np0005601977 nova_compute[183130]: 2026-01-30 09:43:10.950 183134 DEBUG nova.compute.manager [req-d5cad044-c84a-477f-8bf3-05a01abf6e83 req-9eac6087-c2f9-4f08-8770-fd5a88a87cf1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Received event network-vif-deleted-e0f76f69-c08f-4fe7-8510-242c083536a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:43:11 np0005601977 nova_compute[183130]: 2026-01-30 09:43:11.962 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:12 np0005601977 nova_compute[183130]: 2026-01-30 09:43:12.056 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:14 np0005601977 nova_compute[183130]: 2026-01-30 09:43:14.934 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:16 np0005601977 podman[225791]: 2026-01-30 09:43:16.961351435 +0000 UTC m=+0.178544280 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:43:16 np0005601977 nova_compute[183130]: 2026-01-30 09:43:16.965 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:17 np0005601977 nova_compute[183130]: 2026-01-30 09:43:17.058 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:18 np0005601977 nova_compute[183130]: 2026-01-30 09:43:18.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:18 np0005601977 nova_compute[183130]: 2026-01-30 09:43:18.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:18 np0005601977 nova_compute[183130]: 2026-01-30 09:43:18.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.358 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.358 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.359 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:19 np0005601977 nova_compute[183130]: 2026-01-30 09:43:19.359 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:43:21 np0005601977 nova_compute[183130]: 2026-01-30 09:43:21.921 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766186.9200435, b072fab9-d7da-4c12-927b-098cedc02d8c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:43:21 np0005601977 nova_compute[183130]: 2026-01-30 09:43:21.922 183134 INFO nova.compute.manager [-] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:43:21 np0005601977 nova_compute[183130]: 2026-01-30 09:43:21.950 183134 DEBUG nova.compute.manager [None req-eb691dcd-6070-45c7-80e9-cde9296c70f6 - - - - - -] [instance: b072fab9-d7da-4c12-927b-098cedc02d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:43:22 np0005601977 nova_compute[183130]: 2026-01-30 09:43:22.003 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:22 np0005601977 nova_compute[183130]: 2026-01-30 09:43:22.061 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:22 np0005601977 nova_compute[183130]: 2026-01-30 09:43:22.355 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:25 np0005601977 nova_compute[183130]: 2026-01-30 09:43:25.361 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:26 np0005601977 podman[225818]: 2026-01-30 09:43:26.836590979 +0000 UTC m=+0.053645324 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:43:27 np0005601977 nova_compute[183130]: 2026-01-30 09:43:27.043 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:27 np0005601977 nova_compute[183130]: 2026-01-30 09:43:27.064 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:32 np0005601977 nova_compute[183130]: 2026-01-30 09:43:32.049 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:32 np0005601977 nova_compute[183130]: 2026-01-30 09:43:32.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:36 np0005601977 podman[225843]: 2026-01-30 09:43:36.848487273 +0000 UTC m=+0.062926188 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 30 04:43:36 np0005601977 podman[225842]: 2026-01-30 09:43:36.849721008 +0000 UTC m=+0.064177983 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., release=1769056855, io.openshift.expose-services=, config_id=openstack_network_exporter)
Jan 30 04:43:37 np0005601977 nova_compute[183130]: 2026-01-30 09:43:37.068 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:43:37 np0005601977 nova_compute[183130]: 2026-01-30 09:43:37.070 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:43:37 np0005601977 nova_compute[183130]: 2026-01-30 09:43:37.070 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 30 04:43:37 np0005601977 nova_compute[183130]: 2026-01-30 09:43:37.071 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:43:37 np0005601977 nova_compute[183130]: 2026-01-30 09:43:37.097 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:37 np0005601977 nova_compute[183130]: 2026-01-30 09:43:37.098 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:43:41 np0005601977 ovn_controller[95460]: 2026-01-30T09:43:41Z|00457|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Jan 30 04:43:41 np0005601977 podman[225882]: 2026-01-30 09:43:41.84652933 +0000 UTC m=+0.065574112 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 30 04:43:41 np0005601977 podman[225883]: 2026-01-30 09:43:41.852361586 +0000 UTC m=+0.067686283 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:43:42 np0005601977 nova_compute[183130]: 2026-01-30 09:43:42.099 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:47 np0005601977 nova_compute[183130]: 2026-01-30 09:43:47.101 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:47 np0005601977 podman[225927]: 2026-01-30 09:43:47.868001023 +0000 UTC m=+0.085951051 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:43:52 np0005601977 nova_compute[183130]: 2026-01-30 09:43:52.103 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:53.790 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:43:53 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:53.791 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:43:53 np0005601977 nova_compute[183130]: 2026-01-30 09:43:53.822 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:43:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:57 np0005601977 nova_compute[183130]: 2026-01-30 09:43:57.105 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:43:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:57.397 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:57.397 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:57.398 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:57 np0005601977 podman[225955]: 2026-01-30 09:43:57.859044886 +0000 UTC m=+0.067149247 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:43:59 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:43:59.794 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:02 np0005601977 nova_compute[183130]: 2026-01-30 09:44:02.107 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:06 np0005601977 nova_compute[183130]: 2026-01-30 09:44:06.830 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:06 np0005601977 nova_compute[183130]: 2026-01-30 09:44:06.876 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:07 np0005601977 nova_compute[183130]: 2026-01-30 09:44:07.109 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:07 np0005601977 podman[225980]: 2026-01-30 09:44:07.839161786 +0000 UTC m=+0.061689373 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, release=1769056855)
Jan 30 04:44:07 np0005601977 podman[225981]: 2026-01-30 09:44:07.857054664 +0000 UTC m=+0.070612546 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.373 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.373 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.374 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.374 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.503 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.503 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5733MB free_disk=73.24949264526367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.504 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.504 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.624 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.625 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.645 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.660 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.677 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:44:08 np0005601977 nova_compute[183130]: 2026-01-30 09:44:08.678 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:12 np0005601977 nova_compute[183130]: 2026-01-30 09:44:12.111 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:12 np0005601977 podman[226022]: 2026-01-30 09:44:12.839997261 +0000 UTC m=+0.055103966 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:44:12 np0005601977 podman[226023]: 2026-01-30 09:44:12.873198733 +0000 UTC m=+0.083759059 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:44:15 np0005601977 nova_compute[183130]: 2026-01-30 09:44:15.678 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:17 np0005601977 nova_compute[183130]: 2026-01-30 09:44:17.114 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:18 np0005601977 nova_compute[183130]: 2026-01-30 09:44:18.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:18 np0005601977 podman[226063]: 2026-01-30 09:44:18.886718862 +0000 UTC m=+0.101334218 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.578 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.579 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.579 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:19 np0005601977 nova_compute[183130]: 2026-01-30 09:44:19.580 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:44:20 np0005601977 nova_compute[183130]: 2026-01-30 09:44:20.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:21 np0005601977 nova_compute[183130]: 2026-01-30 09:44:21.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:22 np0005601977 nova_compute[183130]: 2026-01-30 09:44:22.116 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:26 np0005601977 nova_compute[183130]: 2026-01-30 09:44:26.339 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:27 np0005601977 nova_compute[183130]: 2026-01-30 09:44:27.116 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:28 np0005601977 podman[226089]: 2026-01-30 09:44:28.836986483 +0000 UTC m=+0.052352334 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:44:32 np0005601977 nova_compute[183130]: 2026-01-30 09:44:32.118 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:37 np0005601977 nova_compute[183130]: 2026-01-30 09:44:37.120 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:37 np0005601977 nova_compute[183130]: 2026-01-30 09:44:37.148 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:37 np0005601977 nova_compute[183130]: 2026-01-30 09:44:37.148 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 30 04:44:37 np0005601977 nova_compute[183130]: 2026-01-30 09:44:37.148 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:44:37 np0005601977 nova_compute[183130]: 2026-01-30 09:44:37.149 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:44:37 np0005601977 nova_compute[183130]: 2026-01-30 09:44:37.150 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:38 np0005601977 podman[226113]: 2026-01-30 09:44:38.84338443 +0000 UTC m=+0.061152946 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 30 04:44:38 np0005601977 podman[226114]: 2026-01-30 09:44:38.843638598 +0000 UTC m=+0.054279749 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 30 04:44:39 np0005601977 nova_compute[183130]: 2026-01-30 09:44:39.948 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:39 np0005601977 nova_compute[183130]: 2026-01-30 09:44:39.949 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:39 np0005601977 nova_compute[183130]: 2026-01-30 09:44:39.968 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.041 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.041 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.049 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.050 183134 INFO nova.compute.claims [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.196 183134 DEBUG nova.compute.provider_tree [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.211 183134 DEBUG nova.scheduler.client.report [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.237 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.238 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.282 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.283 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.304 183134 INFO nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.328 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.409 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.411 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.412 183134 INFO nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Creating image(s)#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.412 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.413 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.413 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.431 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.479 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.480 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.481 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.502 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.557 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.558 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.713 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk 1073741824" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.715 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.715 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.756 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.757 183134 DEBUG nova.virt.disk.api [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.758 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.803 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.804 183134 DEBUG nova.virt.disk.api [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.804 183134 DEBUG nova.objects.instance [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.818 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.818 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Ensure instance console log exists: /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.819 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.819 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:40 np0005601977 nova_compute[183130]: 2026-01-30 09:44:40.819 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:41 np0005601977 nova_compute[183130]: 2026-01-30 09:44:41.424 183134 DEBUG nova.policy [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:44:42 np0005601977 nova_compute[183130]: 2026-01-30 09:44:42.150 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:42 np0005601977 nova_compute[183130]: 2026-01-30 09:44:42.186 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:42 np0005601977 nova_compute[183130]: 2026-01-30 09:44:42.186 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 30 04:44:42 np0005601977 nova_compute[183130]: 2026-01-30 09:44:42.186 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:44:42 np0005601977 nova_compute[183130]: 2026-01-30 09:44:42.187 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:44:42 np0005601977 nova_compute[183130]: 2026-01-30 09:44:42.189 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:43 np0005601977 podman[226172]: 2026-01-30 09:44:43.834637757 +0000 UTC m=+0.048997378 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:44:43 np0005601977 podman[226171]: 2026-01-30 09:44:43.834835882 +0000 UTC m=+0.054573327 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:44:44 np0005601977 nova_compute[183130]: 2026-01-30 09:44:44.545 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Successfully created port: 57979c3b-0d30-474f-b9e6-16ccca270fbf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:44:46 np0005601977 nova_compute[183130]: 2026-01-30 09:44:46.433 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Successfully created port: 4088bc52-1be0-4b2d-91f5-7a4615232b92 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.188 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.190 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.648 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Successfully updated port: 57979c3b-0d30-474f-b9e6-16ccca270fbf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.743 183134 DEBUG nova.compute.manager [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-changed-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.743 183134 DEBUG nova.compute.manager [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing instance network info cache due to event network-changed-57979c3b-0d30-474f-b9e6-16ccca270fbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.744 183134 DEBUG oslo_concurrency.lockutils [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.744 183134 DEBUG oslo_concurrency.lockutils [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:44:47 np0005601977 nova_compute[183130]: 2026-01-30 09:44:47.744 183134 DEBUG nova.network.neutron [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing network info cache for port 57979c3b-0d30-474f-b9e6-16ccca270fbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.005 183134 DEBUG nova.network.neutron [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.654 183134 DEBUG nova.network.neutron [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.675 183134 DEBUG oslo_concurrency.lockutils [req-3d140e22-17c9-4005-b1ac-b8f5399c0bcc req-974261f5-cf97-4cb2-b0ac-4b0a9b5eba94 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.768 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Successfully updated port: 4088bc52-1be0-4b2d-91f5-7a4615232b92 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.789 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.789 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.790 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:44:48 np0005601977 nova_compute[183130]: 2026-01-30 09:44:48.973 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:44:49 np0005601977 nova_compute[183130]: 2026-01-30 09:44:49.826 183134 DEBUG nova.compute.manager [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-changed-4088bc52-1be0-4b2d-91f5-7a4615232b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:49 np0005601977 nova_compute[183130]: 2026-01-30 09:44:49.826 183134 DEBUG nova.compute.manager [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing instance network info cache due to event network-changed-4088bc52-1be0-4b2d-91f5-7a4615232b92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:44:49 np0005601977 nova_compute[183130]: 2026-01-30 09:44:49.826 183134 DEBUG oslo_concurrency.lockutils [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:44:49 np0005601977 podman[226213]: 2026-01-30 09:44:49.873925144 +0000 UTC m=+0.090947822 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.640 183134 DEBUG nova.network.neutron [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.661 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.662 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Instance network_info: |[{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.664 183134 DEBUG oslo_concurrency.lockutils [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.664 183134 DEBUG nova.network.neutron [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing network info cache for port 4088bc52-1be0-4b2d-91f5-7a4615232b92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.671 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Start _get_guest_xml network_info=[{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.679 183134 WARNING nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.687 183134 DEBUG nova.virt.libvirt.host [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.688 183134 DEBUG nova.virt.libvirt.host [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.699 183134 DEBUG nova.virt.libvirt.host [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.700 183134 DEBUG nova.virt.libvirt.host [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.702 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.703 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.704 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.704 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.705 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.705 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.705 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.706 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.707 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.707 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.708 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.708 183134 DEBUG nova.virt.hardware [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.715 183134 DEBUG nova.virt.libvirt.vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:44:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-320003657',display_name='tempest-TestGettingAddress-server-320003657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-320003657',id=45,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbpDVEriN/G87e23tjDUdYO6W+QEtQ+9v/1x/s2NFcnEBKJ6j0PSOvCUNixlnCRe42XdVWeKCW4XyJVokJAQfPGbHlRcdfispH8A6tY+5GgFZvp7MZgsLRLxCewdLADyQ==',key_name='tempest-TestGettingAddress-1604136189',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gto1nwdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:44:40Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=468f5a89-b848-45a6-8649-d09040ab2a09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.716 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.718 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.720 183134 DEBUG nova.virt.libvirt.vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:44:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-320003657',display_name='tempest-TestGettingAddress-server-320003657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-320003657',id=45,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbpDVEriN/G87e23tjDUdYO6W+QEtQ+9v/1x/s2NFcnEBKJ6j0PSOvCUNixlnCRe42XdVWeKCW4XyJVokJAQfPGbHlRcdfispH8A6tY+5GgFZvp7MZgsLRLxCewdLADyQ==',key_name='tempest-TestGettingAddress-1604136189',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gto1nwdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:44:40Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=468f5a89-b848-45a6-8649-d09040ab2a09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.720 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.722 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.723 183134 DEBUG nova.objects.instance [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.739 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <uuid>468f5a89-b848-45a6-8649-d09040ab2a09</uuid>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <name>instance-0000002d</name>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-320003657</nova:name>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:44:50</nova:creationTime>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:port uuid="57979c3b-0d30-474f-b9e6-16ccca270fbf">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        <nova:port uuid="4088bc52-1be0-4b2d-91f5-7a4615232b92">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe46:a916" ipVersion="6"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <entry name="serial">468f5a89-b848-45a6-8649-d09040ab2a09</entry>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <entry name="uuid">468f5a89-b848-45a6-8649-d09040ab2a09</entry>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.config"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:45:cf:e2"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <target dev="tap57979c3b-0d"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:46:a9:16"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <target dev="tap4088bc52-1b"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/console.log" append="off"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:44:50 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:44:50 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:44:50 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:44:50 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.741 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Preparing to wait for external event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.741 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.741 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.741 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.742 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Preparing to wait for external event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.742 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.742 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.742 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.743 183134 DEBUG nova.virt.libvirt.vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:44:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-320003657',display_name='tempest-TestGettingAddress-server-320003657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-320003657',id=45,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbpDVEriN/G87e23tjDUdYO6W+QEtQ+9v/1x/s2NFcnEBKJ6j0PSOvCUNixlnCRe42XdVWeKCW4XyJVokJAQfPGbHlRcdfispH8A6tY+5GgFZvp7MZgsLRLxCewdLADyQ==',key_name='tempest-TestGettingAddress-1604136189',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gto1nwdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:44:40Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=468f5a89-b848-45a6-8649-d09040ab2a09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.743 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.744 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.744 183134 DEBUG os_vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.745 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.745 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.746 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.750 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.750 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57979c3b-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.751 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57979c3b-0d, col_values=(('external_ids', {'iface-id': '57979c3b-0d30-474f-b9e6-16ccca270fbf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:cf:e2', 'vm-uuid': '468f5a89-b848-45a6-8649-d09040ab2a09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.752 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 NetworkManager[55565]: <info>  [1769766290.7536] manager: (tap57979c3b-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.754 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.759 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.760 183134 INFO os_vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d')#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.761 183134 DEBUG nova.virt.libvirt.vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:44:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-320003657',display_name='tempest-TestGettingAddress-server-320003657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-320003657',id=45,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbpDVEriN/G87e23tjDUdYO6W+QEtQ+9v/1x/s2NFcnEBKJ6j0PSOvCUNixlnCRe42XdVWeKCW4XyJVokJAQfPGbHlRcdfispH8A6tY+5GgFZvp7MZgsLRLxCewdLADyQ==',key_name='tempest-TestGettingAddress-1604136189',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gto1nwdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:44:40Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=468f5a89-b848-45a6-8649-d09040ab2a09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.762 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.762 183134 DEBUG nova.network.os_vif_util [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.763 183134 DEBUG os_vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.763 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.763 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.764 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.766 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.767 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4088bc52-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.767 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4088bc52-1b, col_values=(('external_ids', {'iface-id': '4088bc52-1be0-4b2d-91f5-7a4615232b92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:a9:16', 'vm-uuid': '468f5a89-b848-45a6-8649-d09040ab2a09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.768 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 NetworkManager[55565]: <info>  [1769766290.7694] manager: (tap4088bc52-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.770 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.774 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.775 183134 INFO os_vif [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b')#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.887 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.888 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.888 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:45:cf:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.888 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:46:a9:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:44:50 np0005601977 nova_compute[183130]: 2026-01-30 09:44:50.889 183134 INFO nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Using config drive#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.223 183134 INFO nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Creating config drive at /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.config#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.227 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuw2h54q9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.355 183134 DEBUG oslo_concurrency.processutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuw2h54q9" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:44:51 np0005601977 kernel: tap57979c3b-0d: entered promiscuous mode
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.4177] manager: (tap57979c3b-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00458|binding|INFO|Claiming lport 57979c3b-0d30-474f-b9e6-16ccca270fbf for this chassis.
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00459|binding|INFO|57979c3b-0d30-474f-b9e6-16ccca270fbf: Claiming fa:16:3e:45:cf:e2 10.100.0.12
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.419 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.428 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.4322] manager: (tap4088bc52-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Jan 30 04:44:51 np0005601977 kernel: tap4088bc52-1b: entered promiscuous mode
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.434 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.437 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00460|if_status|INFO|Not updating pb chassis for 4088bc52-1be0-4b2d-91f5-7a4615232b92 now as sb is readonly
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00461|binding|INFO|Claiming lport 4088bc52-1be0-4b2d-91f5-7a4615232b92 for this chassis.
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00462|binding|INFO|4088bc52-1be0-4b2d-91f5-7a4615232b92: Claiming fa:16:3e:46:a9:16 2001:db8::f816:3eff:fe46:a916
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.451 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:cf:e2 10.100.0.12'], port_security=['fa:16:3e:45:cf:e2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-718822f1-f31b-43f7-81ad-7c257e53efa2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '734be60e-cf29-48be-bfcc-a2e866fbc7f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da6afba3-4dcc-4b99-bb94-fa9ef8e0909a, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=57979c3b-0d30-474f-b9e6-16ccca270fbf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.453 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 57979c3b-0d30-474f-b9e6-16ccca270fbf in datapath 718822f1-f31b-43f7-81ad-7c257e53efa2 bound to our chassis#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.456 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 718822f1-f31b-43f7-81ad-7c257e53efa2#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.457 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 systemd-udevd[226265]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:44:51 np0005601977 systemd-udevd[226267]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00463|binding|INFO|Setting lport 57979c3b-0d30-474f-b9e6-16ccca270fbf ovn-installed in OVS
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00464|binding|INFO|Setting lport 4088bc52-1be0-4b2d-91f5-7a4615232b92 ovn-installed in OVS
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.462 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.469 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc0c764-717d-4a6a-9c8e-51b3722875eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.470 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap718822f1-f1 in ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.4724] device (tap57979c3b-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.4733] device (tap4088bc52-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.472 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap718822f1-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.473 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[10ccafa8-2cc1-4cef-9cd1-62f4d01671f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.4742] device (tap57979c3b-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.4747] device (tap4088bc52-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.474 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f2cacf4c-6ddd-4cc0-9f92-b70f1abb592d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 systemd-machined[154431]: New machine qemu-37-instance-0000002d.
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.483 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[57ad0c5f-3a8c-45e4-8722-ab76de2cc189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 systemd[1]: Started Virtual Machine qemu-37-instance-0000002d.
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.495 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c97fbce0-b877-4db6-8ba5-9856590f4424]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00465|binding|INFO|Setting lport 4088bc52-1be0-4b2d-91f5-7a4615232b92 up in Southbound
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00466|binding|INFO|Setting lport 57979c3b-0d30-474f-b9e6-16ccca270fbf up in Southbound
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.503 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:a9:16 2001:db8::f816:3eff:fe46:a916'], port_security=['fa:16:3e:46:a9:16 2001:db8::f816:3eff:fe46:a916'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe46:a916/64', 'neutron:device_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '734be60e-cf29-48be-bfcc-a2e866fbc7f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=323b9e02-2128-4b84-8cbe-de6525d1728d, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=4088bc52-1be0-4b2d-91f5-7a4615232b92) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.516 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[431e5e6c-0878-4d12-89ca-fd6d3435a97b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.5226] manager: (tap718822f1-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.522 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[da516d61-0a31-4c5d-b1ee-4e8fcef3b3fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.547 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d479d98a-c387-400a-99f9-d383d83a121a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.550 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[02fafd7f-fb01-4de9-80b6-708f027844fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.5649] device (tap718822f1-f0): carrier: link connected
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.567 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[50d96a0e-f503-4bf1-818e-6b113f9676ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.578 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8323acf1-47b8-4ad5-8891-d11522b4fc76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap718822f1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:b1:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482290, 'reachable_time': 25787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226302, 'error': None, 'target': 'ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.587 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6c9a60-86de-4715-8730-c9cfe26b17ac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:b124'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482290, 'tstamp': 482290}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226303, 'error': None, 'target': 'ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.598 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf7ec6b-f373-40a9-b4fe-9b118bfb89a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap718822f1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:b1:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482290, 'reachable_time': 25787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226304, 'error': None, 'target': 'ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.613 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[93540a50-294e-45bf-85bf-dc22262a646e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.648 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[41d1e182-6d16-498c-991f-e9e747e7fcf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.649 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap718822f1-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.650 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.650 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap718822f1-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.652 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 kernel: tap718822f1-f0: entered promiscuous mode
Jan 30 04:44:51 np0005601977 NetworkManager[55565]: <info>  [1769766291.6560] manager: (tap718822f1-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.656 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.660 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap718822f1-f0, col_values=(('external_ids', {'iface-id': 'd6e15ff1-8451-4134-9247-4d8c23ead538'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.661 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:51Z|00467|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.663 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.663 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/718822f1-f31b-43f7-81ad-7c257e53efa2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/718822f1-f31b-43f7-81ad-7c257e53efa2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.666 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[96b0f922-3782-4042-806e-89d5cb06d877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.667 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-718822f1-f31b-43f7-81ad-7c257e53efa2
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/718822f1-f31b-43f7-81ad-7c257e53efa2.pid.haproxy
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 718822f1-f31b-43f7-81ad-7c257e53efa2
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:44:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:51.668 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2', 'env', 'PROCESS_TAG=haproxy-718822f1-f31b-43f7-81ad-7c257e53efa2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/718822f1-f31b-43f7-81ad-7c257e53efa2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.669 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.720 183134 DEBUG nova.network.neutron [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated VIF entry in instance network info cache for port 4088bc52-1be0-4b2d-91f5-7a4615232b92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.721 183134 DEBUG nova.network.neutron [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:44:51 np0005601977 nova_compute[183130]: 2026-01-30 09:44:51.762 183134 DEBUG oslo_concurrency.lockutils [req-36c7616c-00fe-4389-8628-456fa2c8aa1b req-6a35517d-6cbd-4df2-aef8-cbd56f939b56 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:44:51 np0005601977 podman[226336]: 2026-01-30 09:44:51.996154959 +0000 UTC m=+0.045989581 container create 844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:44:52 np0005601977 systemd[1]: Started libpod-conmon-844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60.scope.
Jan 30 04:44:52 np0005601977 podman[226336]: 2026-01-30 09:44:51.971358198 +0000 UTC m=+0.021192800 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:44:52 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:44:52 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c04f847a71c873f5657082cc6afeb02521acfc38de06995fe48efc99b0bfac1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:44:52 np0005601977 podman[226336]: 2026-01-30 09:44:52.086888953 +0000 UTC m=+0.136723555 container init 844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 30 04:44:52 np0005601977 podman[226336]: 2026-01-30 09:44:52.093453791 +0000 UTC m=+0.143288413 container start 844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:44:52 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [NOTICE]   (226356) : New worker (226358) forked
Jan 30 04:44:52 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [NOTICE]   (226356) : Loading success.
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.159 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 4088bc52-1be0-4b2d-91f5-7a4615232b92 in datapath d8a742aa-08a4-4990-8e09-fbcff59d9bd9 unbound from our chassis#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.161 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8a742aa-08a4-4990-8e09-fbcff59d9bd9#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.169 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c9956a9a-8022-48dc-b4c4-32b838f43c04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.169 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8a742aa-01 in ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.171 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8a742aa-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.171 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[76851e76-046b-40d6-92ab-227459dff68b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.172 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4f61c7-c11b-49f5-9392-1da687eb01ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.180 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[73d726a8-0e74-435b-b4fd-d768f8ddebac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.226 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.230 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7596f410-60cc-4cb9-93bf-2c06833c985c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.252 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[db28547c-daa9-4e98-aebf-3099964ee9aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 NetworkManager[55565]: <info>  [1769766292.2578] manager: (tapd8a742aa-00): new Veth device (/org/freedesktop/NetworkManager/Devices/195)
Jan 30 04:44:52 np0005601977 systemd-udevd[226296]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.256 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[47f72eed-4133-48f0-84fa-bfcc79ddf8e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.284 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb546bf-1ab9-49aa-882b-0a7c4689c50d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.289 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b6fd83c4-cecf-43a3-a181-a2fd1959b6f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 NetworkManager[55565]: <info>  [1769766292.3106] device (tapd8a742aa-00): carrier: link connected
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.317 183134 DEBUG nova.compute.manager [req-916ca7b8-e25b-4970-b94f-e53836379946 req-b4ce0027-854d-455c-8acb-a7d5d52d031a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.316 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d62001ac-c33e-46c1-8aa3-eed49d0f2881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.317 183134 DEBUG oslo_concurrency.lockutils [req-916ca7b8-e25b-4970-b94f-e53836379946 req-b4ce0027-854d-455c-8acb-a7d5d52d031a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.318 183134 DEBUG oslo_concurrency.lockutils [req-916ca7b8-e25b-4970-b94f-e53836379946 req-b4ce0027-854d-455c-8acb-a7d5d52d031a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.318 183134 DEBUG oslo_concurrency.lockutils [req-916ca7b8-e25b-4970-b94f-e53836379946 req-b4ce0027-854d-455c-8acb-a7d5d52d031a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.318 183134 DEBUG nova.compute.manager [req-916ca7b8-e25b-4970-b94f-e53836379946 req-b4ce0027-854d-455c-8acb-a7d5d52d031a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Processing event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.321 183134 DEBUG nova.compute.manager [req-1fed2771-77c1-48cc-996b-c00f59116577 req-d8933e80-dd63-480c-9a73-9ce982a9eda6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.322 183134 DEBUG oslo_concurrency.lockutils [req-1fed2771-77c1-48cc-996b-c00f59116577 req-d8933e80-dd63-480c-9a73-9ce982a9eda6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.322 183134 DEBUG oslo_concurrency.lockutils [req-1fed2771-77c1-48cc-996b-c00f59116577 req-d8933e80-dd63-480c-9a73-9ce982a9eda6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.322 183134 DEBUG oslo_concurrency.lockutils [req-1fed2771-77c1-48cc-996b-c00f59116577 req-d8933e80-dd63-480c-9a73-9ce982a9eda6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.323 183134 DEBUG nova.compute.manager [req-1fed2771-77c1-48cc-996b-c00f59116577 req-d8933e80-dd63-480c-9a73-9ce982a9eda6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Processing event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.331 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3552ee-43dd-4df8-aaad-f0d760b563e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a742aa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:5f:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482365, 'reachable_time': 24305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226377, 'error': None, 'target': 'ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.343 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b77bb753-08e7-4aff-bfe7-a82b3480b43a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:5f6b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482365, 'tstamp': 482365}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226378, 'error': None, 'target': 'ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.359 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b375083f-48bf-4175-8f78-370f7f8febbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a742aa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:5f:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482365, 'reachable_time': 24305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226379, 'error': None, 'target': 'ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.390 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c29a26-a56b-4d33-aee3-fb78109f4fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.414 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[224190bd-a4f5-4d77-a749-10406b80315e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.415 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a742aa-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.416 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.416 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8a742aa-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.418 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:52 np0005601977 kernel: tapd8a742aa-00: entered promiscuous mode
Jan 30 04:44:52 np0005601977 NetworkManager[55565]: <info>  [1769766292.4198] manager: (tapd8a742aa-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.421 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8a742aa-00, col_values=(('external_ids', {'iface-id': 'b9267414-60b5-4998-b9cd-8b1c6a718595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:52Z|00468|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.427 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.428 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8a742aa-08a4-4990-8e09-fbcff59d9bd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8a742aa-08a4-4990-8e09-fbcff59d9bd9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.429 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6dcd6222-f1d2-4349-af23-5cce78bfcd6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.430 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-d8a742aa-08a4-4990-8e09-fbcff59d9bd9
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/d8a742aa-08a4-4990-8e09-fbcff59d9bd9.pid.haproxy
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID d8a742aa-08a4-4990-8e09-fbcff59d9bd9
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:44:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:52.430 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'env', 'PROCESS_TAG=haproxy-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8a742aa-08a4-4990-8e09-fbcff59d9bd9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.536 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.537 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766292.5360441, 468f5a89-b848-45a6-8649-d09040ab2a09 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.537 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] VM Started (Lifecycle Event)#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.544 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.547 183134 INFO nova.virt.libvirt.driver [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Instance spawned successfully.#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.548 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.597 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.600 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.649 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.650 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.651 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.652 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.653 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.654 183134 DEBUG nova.virt.libvirt.driver [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.723 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.724 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766292.5371017, 468f5a89-b848-45a6-8649-d09040ab2a09 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.724 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:44:52 np0005601977 podman[226417]: 2026-01-30 09:44:52.763743461 +0000 UTC m=+0.051519560 container create f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:44:52 np0005601977 systemd[1]: Started libpod-conmon-f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29.scope.
Jan 30 04:44:52 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:44:52 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb8fd1413ef152a32d8424edf21c0f5217ec02708cd30b904b6bafde678d9f25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:44:52 np0005601977 podman[226417]: 2026-01-30 09:44:52.738136036 +0000 UTC m=+0.025912135 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:44:52 np0005601977 podman[226417]: 2026-01-30 09:44:52.839403083 +0000 UTC m=+0.127179192 container init f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:44:52 np0005601977 podman[226417]: 2026-01-30 09:44:52.843885511 +0000 UTC m=+0.131661640 container start f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:44:52 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [NOTICE]   (226436) : New worker (226438) forked
Jan 30 04:44:52 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [NOTICE]   (226436) : Loading success.
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.867 183134 INFO nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Took 12.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.868 183134 DEBUG nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.904 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.908 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766292.5414023, 468f5a89-b848-45a6-8649-d09040ab2a09 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:44:52 np0005601977 nova_compute[183130]: 2026-01-30 09:44:52.908 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:44:53 np0005601977 nova_compute[183130]: 2026-01-30 09:44:53.017 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:44:53 np0005601977 nova_compute[183130]: 2026-01-30 09:44:53.027 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:44:53 np0005601977 nova_compute[183130]: 2026-01-30 09:44:53.042 183134 INFO nova.compute.manager [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Took 13.03 seconds to build instance.#033[00m
Jan 30 04:44:53 np0005601977 nova_compute[183130]: 2026-01-30 09:44:53.074 183134 DEBUG oslo_concurrency.lockutils [None req-93beda37-8447-48b4-9176-2a1049b6be2e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.410 183134 DEBUG nova.compute.manager [req-06bac703-776f-427b-891c-8b33e2521706 req-a171c252-fa2b-4ee2-8c36-9e3c85a9dcca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.412 183134 DEBUG oslo_concurrency.lockutils [req-06bac703-776f-427b-891c-8b33e2521706 req-a171c252-fa2b-4ee2-8c36-9e3c85a9dcca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.413 183134 DEBUG oslo_concurrency.lockutils [req-06bac703-776f-427b-891c-8b33e2521706 req-a171c252-fa2b-4ee2-8c36-9e3c85a9dcca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.413 183134 DEBUG oslo_concurrency.lockutils [req-06bac703-776f-427b-891c-8b33e2521706 req-a171c252-fa2b-4ee2-8c36-9e3c85a9dcca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.414 183134 DEBUG nova.compute.manager [req-06bac703-776f-427b-891c-8b33e2521706 req-a171c252-fa2b-4ee2-8c36-9e3c85a9dcca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] No waiting events found dispatching network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.414 183134 WARNING nova.compute.manager [req-06bac703-776f-427b-891c-8b33e2521706 req-a171c252-fa2b-4ee2-8c36-9e3c85a9dcca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received unexpected event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.428 183134 DEBUG nova.compute.manager [req-8a6705c6-d811-4b9e-be50-e7bbadb98465 req-36b02178-e943-42c6-9bd1-e063a526cfae dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.429 183134 DEBUG oslo_concurrency.lockutils [req-8a6705c6-d811-4b9e-be50-e7bbadb98465 req-36b02178-e943-42c6-9bd1-e063a526cfae dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.430 183134 DEBUG oslo_concurrency.lockutils [req-8a6705c6-d811-4b9e-be50-e7bbadb98465 req-36b02178-e943-42c6-9bd1-e063a526cfae dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.431 183134 DEBUG oslo_concurrency.lockutils [req-8a6705c6-d811-4b9e-be50-e7bbadb98465 req-36b02178-e943-42c6-9bd1-e063a526cfae dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.431 183134 DEBUG nova.compute.manager [req-8a6705c6-d811-4b9e-be50-e7bbadb98465 req-36b02178-e943-42c6-9bd1-e063a526cfae dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] No waiting events found dispatching network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:44:54 np0005601977 nova_compute[183130]: 2026-01-30 09:44:54.432 183134 WARNING nova.compute.manager [req-8a6705c6-d811-4b9e-be50-e7bbadb98465 req-36b02178-e943-42c6-9bd1-e063a526cfae dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received unexpected event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf for instance with vm_state active and task_state None.#033[00m
Jan 30 04:44:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:55.439 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:44:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:55.441 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:44:55 np0005601977 nova_compute[183130]: 2026-01-30 09:44:55.442 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:55 np0005601977 nova_compute[183130]: 2026-01-30 09:44:55.790 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:56 np0005601977 nova_compute[183130]: 2026-01-30 09:44:56.955 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:56 np0005601977 NetworkManager[55565]: <info>  [1769766296.9566] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 30 04:44:56 np0005601977 NetworkManager[55565]: <info>  [1769766296.9572] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 30 04:44:56 np0005601977 nova_compute[183130]: 2026-01-30 09:44:56.987 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:56Z|00469|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:44:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:44:56Z|00470|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.047 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.229 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.284 183134 DEBUG nova.compute.manager [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-changed-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.285 183134 DEBUG nova.compute.manager [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing instance network info cache due to event network-changed-57979c3b-0d30-474f-b9e6-16ccca270fbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.286 183134 DEBUG oslo_concurrency.lockutils [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.286 183134 DEBUG oslo_concurrency.lockutils [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:44:57 np0005601977 nova_compute[183130]: 2026-01-30 09:44:57.287 183134 DEBUG nova.network.neutron [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing network info cache for port 57979c3b-0d30-474f-b9e6-16ccca270fbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:44:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:57.397 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:57.398 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:44:57.399 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:59 np0005601977 nova_compute[183130]: 2026-01-30 09:44:59.558 183134 DEBUG nova.network.neutron [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated VIF entry in instance network info cache for port 57979c3b-0d30-474f-b9e6-16ccca270fbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:44:59 np0005601977 nova_compute[183130]: 2026-01-30 09:44:59.560 183134 DEBUG nova.network.neutron [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:44:59 np0005601977 nova_compute[183130]: 2026-01-30 09:44:59.586 183134 DEBUG oslo_concurrency.lockutils [req-92cbf080-bd4f-4bbf-8e56-08fe308dbaac req-c32ef971-61cd-4e2c-9957-db1e16fbb0cb dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:44:59 np0005601977 podman[226448]: 2026-01-30 09:44:59.858092473 +0000 UTC m=+0.068765805 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:45:00 np0005601977 nova_compute[183130]: 2026-01-30 09:45:00.791 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:02 np0005601977 nova_compute[183130]: 2026-01-30 09:45:02.231 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:04Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:cf:e2 10.100.0.12
Jan 30 04:45:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:04Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:cf:e2 10.100.0.12
Jan 30 04:45:05 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:05.444 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:05 np0005601977 nova_compute[183130]: 2026-01-30 09:45:05.794 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:07 np0005601977 nova_compute[183130]: 2026-01-30 09:45:07.274 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.372 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.372 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.373 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.373 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:45:09 np0005601977 podman[226484]: 2026-01-30 09:45:09.495953573 +0000 UTC m=+0.075843988 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., version=9.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, vcs-type=git, managed_by=edpm_ansible, release=1769056855, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.503 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:09 np0005601977 podman[226485]: 2026-01-30 09:45:09.516326838 +0000 UTC m=+0.087215365 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.575 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.576 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.638 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.864 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.866 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5489MB free_disk=73.21949768066406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.866 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.867 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.933 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.933 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.933 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.971 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:45:09 np0005601977 nova_compute[183130]: 2026-01-30 09:45:09.987 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:45:10 np0005601977 nova_compute[183130]: 2026-01-30 09:45:10.014 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:45:10 np0005601977 nova_compute[183130]: 2026-01-30 09:45:10.014 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:10 np0005601977 nova_compute[183130]: 2026-01-30 09:45:10.833 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:12 np0005601977 nova_compute[183130]: 2026-01-30 09:45:12.276 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:14 np0005601977 podman[226530]: 2026-01-30 09:45:14.823002026 +0000 UTC m=+0.043494759 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:45:14 np0005601977 podman[226531]: 2026-01-30 09:45:14.828301028 +0000 UTC m=+0.045405914 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:45:15 np0005601977 nova_compute[183130]: 2026-01-30 09:45:15.837 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:17 np0005601977 nova_compute[183130]: 2026-01-30 09:45:17.277 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:18 np0005601977 nova_compute[183130]: 2026-01-30 09:45:18.015 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:18 np0005601977 nova_compute[183130]: 2026-01-30 09:45:18.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.580 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.580 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.581 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:45:19 np0005601977 nova_compute[183130]: 2026-01-30 09:45:19.581 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.839 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:20 np0005601977 podman[226573]: 2026-01-30 09:45:20.863808936 +0000 UTC m=+0.079861674 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.868 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.868 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.885 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.957 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.957 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.965 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:45:20 np0005601977 nova_compute[183130]: 2026-01-30 09:45:20.965 183134 INFO nova.compute.claims [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.077 183134 DEBUG nova.compute.provider_tree [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.093 183134 DEBUG nova.scheduler.client.report [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.114 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.115 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.153 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.154 183134 DEBUG nova.network.neutron [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.170 183134 INFO nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.185 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.274 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.275 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.276 183134 INFO nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Creating image(s)#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.276 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.277 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.277 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.296 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.359 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.361 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.362 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.380 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.415 183134 DEBUG nova.policy [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.427 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.428 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.453 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.454 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.455 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.500 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.502 183134 DEBUG nova.virt.disk.api [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.502 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.553 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.554 183134 DEBUG nova.virt.disk.api [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.555 183134 DEBUG nova.objects.instance [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 22257a3d-81ea-4635-a3b3-c2d6695610fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.583 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.584 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Ensure instance console log exists: /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.584 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.585 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:21 np0005601977 nova_compute[183130]: 2026-01-30 09:45:21.585 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:22 np0005601977 nova_compute[183130]: 2026-01-30 09:45:22.279 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.351 183134 DEBUG nova.network.neutron [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Successfully created port: 7334f7f7-6252-4aca-8bec-af4bec57bacd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.814 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.894 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.895 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.895 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.895 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.896 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.896 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:23 np0005601977 nova_compute[183130]: 2026-01-30 09:45:23.896 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:45:24 np0005601977 nova_compute[183130]: 2026-01-30 09:45:24.776 183134 DEBUG nova.network.neutron [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Successfully updated port: 7334f7f7-6252-4aca-8bec-af4bec57bacd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:45:24 np0005601977 nova_compute[183130]: 2026-01-30 09:45:24.800 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:45:24 np0005601977 nova_compute[183130]: 2026-01-30 09:45:24.801 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:45:24 np0005601977 nova_compute[183130]: 2026-01-30 09:45:24.801 183134 DEBUG nova.network.neutron [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:45:25 np0005601977 nova_compute[183130]: 2026-01-30 09:45:25.409 183134 DEBUG nova.network.neutron [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:45:25 np0005601977 nova_compute[183130]: 2026-01-30 09:45:25.841 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.436 183134 DEBUG nova.network.neutron [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updating instance_info_cache with network_info: [{"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.604 183134 DEBUG nova.compute.manager [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-changed-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.605 183134 DEBUG nova.compute.manager [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Refreshing instance network info cache due to event network-changed-7334f7f7-6252-4aca-8bec-af4bec57bacd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.605 183134 DEBUG oslo_concurrency.lockutils [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.642 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.643 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Instance network_info: |[{"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.643 183134 DEBUG oslo_concurrency.lockutils [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.644 183134 DEBUG nova.network.neutron [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Refreshing network info cache for port 7334f7f7-6252-4aca-8bec-af4bec57bacd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.647 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Start _get_guest_xml network_info=[{"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.653 183134 WARNING nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.657 183134 DEBUG nova.virt.libvirt.host [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.658 183134 DEBUG nova.virt.libvirt.host [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.661 183134 DEBUG nova.virt.libvirt.host [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.662 183134 DEBUG nova.virt.libvirt.host [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.664 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.664 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.665 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.665 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.665 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.665 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.666 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.666 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.666 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.666 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.666 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.667 183134 DEBUG nova.virt.hardware [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.672 183134 DEBUG nova.virt.libvirt.vif [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:45:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-414017162',display_name='tempest-TestNetworkBasicOps-server-414017162',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-414017162',id=47,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7u/zck9RgprU+Y3YnA2bCIqvkfD1ap714bw6i8E8rYZ2NRSYp8uofp1yF0TaYfoaRcBHVp/jA+XklGxN4OU0MpZMoZrrlhNIx4e14K3CIwumlYBFtyvuscuIoCk7KZUg==',key_name='tempest-TestNetworkBasicOps-172188972',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-l3nynb42',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:45:21Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=22257a3d-81ea-4635-a3b3-c2d6695610fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.673 183134 DEBUG nova.network.os_vif_util [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.674 183134 DEBUG nova.network.os_vif_util [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.674 183134 DEBUG nova.objects.instance [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid 22257a3d-81ea-4635-a3b3-c2d6695610fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.744 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <uuid>22257a3d-81ea-4635-a3b3-c2d6695610fc</uuid>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <name>instance-0000002f</name>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-414017162</nova:name>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:45:26</nova:creationTime>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        <nova:port uuid="7334f7f7-6252-4aca-8bec-af4bec57bacd">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <entry name="serial">22257a3d-81ea-4635-a3b3-c2d6695610fc</entry>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <entry name="uuid">22257a3d-81ea-4635-a3b3-c2d6695610fc</entry>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.config"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:82:8c:54"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <target dev="tap7334f7f7-62"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/console.log" append="off"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:45:26 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:45:26 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:45:26 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:45:26 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.745 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Preparing to wait for external event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.745 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.745 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.745 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.746 183134 DEBUG nova.virt.libvirt.vif [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:45:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-414017162',display_name='tempest-TestNetworkBasicOps-server-414017162',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-414017162',id=47,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7u/zck9RgprU+Y3YnA2bCIqvkfD1ap714bw6i8E8rYZ2NRSYp8uofp1yF0TaYfoaRcBHVp/jA+XklGxN4OU0MpZMoZrrlhNIx4e14K3CIwumlYBFtyvuscuIoCk7KZUg==',key_name='tempest-TestNetworkBasicOps-172188972',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-l3nynb42',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:45:21Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=22257a3d-81ea-4635-a3b3-c2d6695610fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.746 183134 DEBUG nova.network.os_vif_util [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.746 183134 DEBUG nova.network.os_vif_util [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.747 183134 DEBUG os_vif [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.747 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.748 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.748 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.750 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.750 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7334f7f7-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.750 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7334f7f7-62, col_values=(('external_ids', {'iface-id': '7334f7f7-6252-4aca-8bec-af4bec57bacd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:8c:54', 'vm-uuid': '22257a3d-81ea-4635-a3b3-c2d6695610fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.752 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:26 np0005601977 NetworkManager[55565]: <info>  [1769766326.7534] manager: (tap7334f7f7-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.754 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.764 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.766 183134 INFO os_vif [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62')#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.881 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.882 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.882 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:82:8c:54, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:45:26 np0005601977 nova_compute[183130]: 2026-01-30 09:45:26.883 183134 INFO nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Using config drive#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.294 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.538 183134 INFO nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Creating config drive at /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.config#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.544 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8d4bpslo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.661 183134 DEBUG oslo_concurrency.processutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8d4bpslo" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:45:27 np0005601977 kernel: tap7334f7f7-62: entered promiscuous mode
Jan 30 04:45:27 np0005601977 NetworkManager[55565]: <info>  [1769766327.7197] manager: (tap7334f7f7-62): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Jan 30 04:45:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:27Z|00471|binding|INFO|Claiming lport 7334f7f7-6252-4aca-8bec-af4bec57bacd for this chassis.
Jan 30 04:45:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:27Z|00472|binding|INFO|7334f7f7-6252-4aca-8bec-af4bec57bacd: Claiming fa:16:3e:82:8c:54 10.100.0.3
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.721 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.741 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:8c:54 10.100.0.3'], port_security=['fa:16:3e:82:8c:54 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-507230f8-8e41-47dc-b940-3f8b7601fa57', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91d771d2-9575-4f5c-855f-5bd2e916c9e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b180ff09-b150-4828-9ed2-8fd824c54036, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=7334f7f7-6252-4aca-8bec-af4bec57bacd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.743 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 7334f7f7-6252-4aca-8bec-af4bec57bacd in datapath 507230f8-8e41-47dc-b940-3f8b7601fa57 bound to our chassis#033[00m
Jan 30 04:45:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:27Z|00473|binding|INFO|Setting lport 7334f7f7-6252-4aca-8bec-af4bec57bacd ovn-installed in OVS
Jan 30 04:45:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:27Z|00474|binding|INFO|Setting lport 7334f7f7-6252-4aca-8bec-af4bec57bacd up in Southbound
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.745 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.745 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 507230f8-8e41-47dc-b940-3f8b7601fa57#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.750 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.760 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[162ee08e-012f-4be5-b7c1-cf30a0d56e8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.761 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap507230f8-81 in ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.763 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap507230f8-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.763 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b4eb912a-44bc-47d1-b519-a4506ded68ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.765 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6fa17d-afc7-4e03-b3ee-43310c03e38c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 systemd-udevd[226634]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:45:27 np0005601977 systemd-machined[154431]: New machine qemu-38-instance-0000002f.
Jan 30 04:45:27 np0005601977 NetworkManager[55565]: <info>  [1769766327.7766] device (tap7334f7f7-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:45:27 np0005601977 NetworkManager[55565]: <info>  [1769766327.7773] device (tap7334f7f7-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.775 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[107026d5-1cb0-4f5b-8c1e-3487e2d323fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 systemd[1]: Started Virtual Machine qemu-38-instance-0000002f.
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.797 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[23f1329e-1889-43e6-94cb-6d2cbd8c0614]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.819 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[227ab4e7-d70e-4481-81b3-fcd4f98f8537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.824 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[08811ef5-01ef-4a32-a4c0-3fcbca7ba2c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 NetworkManager[55565]: <info>  [1769766327.8253] manager: (tap507230f8-80): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Jan 30 04:45:27 np0005601977 systemd-udevd[226637]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.844 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c4e909-3a27-47d7-9944-13ee5d4f2eda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.848 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ef291d-39ab-434d-bb4c-1300328ae1a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 NetworkManager[55565]: <info>  [1769766327.8642] device (tap507230f8-80): carrier: link connected
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.867 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[bf6941b4-f386-44df-8a60-c4fbee74c23f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.886 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a2dd5c2f-e045-4636-bca9-5a456d8b9b36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap507230f8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:07:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485920, 'reachable_time': 16679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226666, 'error': None, 'target': 'ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.895 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e42b98-e775-45d9-ab54-2e47dff48cbf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9a:73b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485920, 'tstamp': 485920}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226667, 'error': None, 'target': 'ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.910 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b7945b-fc72-4516-bf74-33cae6d5f339]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap507230f8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:07:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485920, 'reachable_time': 16679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226668, 'error': None, 'target': 'ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.932 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb94010-c28a-45ea-91e4-bfc528cda667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.975 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f1383d-0724-4ee2-92e6-ba385e0fae4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.977 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap507230f8-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.977 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.978 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap507230f8-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.981 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 kernel: tap507230f8-80: entered promiscuous mode
Jan 30 04:45:27 np0005601977 NetworkManager[55565]: <info>  [1769766327.9818] manager: (tap507230f8-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.983 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.991 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap507230f8-80, col_values=(('external_ids', {'iface-id': 'e01c03ac-3241-43fd-9749-69da38b9c6c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.993 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:27Z|00475|binding|INFO|Releasing lport e01c03ac-3241-43fd-9749-69da38b9c6c2 from this chassis (sb_readonly=0)
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.994 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:27.997 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/507230f8-8e41-47dc-b940-3f8b7601fa57.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/507230f8-8e41-47dc-b940-3f8b7601fa57.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:45:27 np0005601977 nova_compute[183130]: 2026-01-30 09:45:27.999 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:28.001 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f4fffd08-6ed9-46b4-a49a-93620d8e9010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:28.002 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-507230f8-8e41-47dc-b940-3f8b7601fa57
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/507230f8-8e41-47dc-b940-3f8b7601fa57.pid.haproxy
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 507230f8-8e41-47dc-b940-3f8b7601fa57
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:45:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:28.004 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57', 'env', 'PROCESS_TAG=haproxy-507230f8-8e41-47dc-b940-3f8b7601fa57', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/507230f8-8e41-47dc-b940-3f8b7601fa57.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:45:28 np0005601977 podman[226700]: 2026-01-30 09:45:28.326024608 +0000 UTC m=+0.041141222 container create 7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:45:28 np0005601977 systemd[1]: Started libpod-conmon-7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537.scope.
Jan 30 04:45:28 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:45:28 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27a9bec462368ff477c62b204c019c0654d536c795d4d88833435aa7c6ff5eb9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:45:28 np0005601977 podman[226700]: 2026-01-30 09:45:28.304381127 +0000 UTC m=+0.019497751 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:45:28 np0005601977 podman[226700]: 2026-01-30 09:45:28.413678304 +0000 UTC m=+0.128794928 container init 7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:45:28 np0005601977 podman[226700]: 2026-01-30 09:45:28.418807411 +0000 UTC m=+0.133924025 container start 7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:45:28 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [NOTICE]   (226720) : New worker (226722) forked
Jan 30 04:45:28 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [NOTICE]   (226720) : Loading success.
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.516 183134 DEBUG nova.compute.manager [req-602390ee-3315-4c94-9fce-5d04b795da7c req-3ea352f3-55b9-4aa7-9ff0-b7d5a68c7281 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.516 183134 DEBUG oslo_concurrency.lockutils [req-602390ee-3315-4c94-9fce-5d04b795da7c req-3ea352f3-55b9-4aa7-9ff0-b7d5a68c7281 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.516 183134 DEBUG oslo_concurrency.lockutils [req-602390ee-3315-4c94-9fce-5d04b795da7c req-3ea352f3-55b9-4aa7-9ff0-b7d5a68c7281 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.517 183134 DEBUG oslo_concurrency.lockutils [req-602390ee-3315-4c94-9fce-5d04b795da7c req-3ea352f3-55b9-4aa7-9ff0-b7d5a68c7281 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.517 183134 DEBUG nova.compute.manager [req-602390ee-3315-4c94-9fce-5d04b795da7c req-3ea352f3-55b9-4aa7-9ff0-b7d5a68c7281 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Processing event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.621 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766328.6207557, 22257a3d-81ea-4635-a3b3-c2d6695610fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.622 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] VM Started (Lifecycle Event)#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.626 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.634 183134 DEBUG nova.network.neutron [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updated VIF entry in instance network info cache for port 7334f7f7-6252-4aca-8bec-af4bec57bacd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.635 183134 DEBUG nova.network.neutron [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updating instance_info_cache with network_info: [{"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.651 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.666 183134 DEBUG oslo_concurrency.lockutils [req-b41efe3b-f52f-42b7-8adf-d7dd497a37e9 req-1af9c558-3de7-438b-a68f-0553567c0a7c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.676 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.680 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.685 183134 INFO nova.virt.libvirt.driver [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Instance spawned successfully.#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.686 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.704 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.704 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766328.6209164, 22257a3d-81ea-4635-a3b3-c2d6695610fc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.704 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.719 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.719 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.720 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.720 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.721 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.721 183134 DEBUG nova.virt.libvirt.driver [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.746 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.749 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766328.6756983, 22257a3d-81ea-4635-a3b3-c2d6695610fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.749 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] VM Resumed (Lifecycle Event)
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.799 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.802 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.813 183134 INFO nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Took 7.54 seconds to spawn the instance on the hypervisor.
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.813 183134 DEBUG nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.862 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.961 183134 INFO nova.compute.manager [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Took 8.03 seconds to build instance.
Jan 30 04:45:28 np0005601977 nova_compute[183130]: 2026-01-30 09:45:28.984 183134 DEBUG oslo_concurrency.lockutils [None req-9196ce7a-51a0-43e3-9b33-e33841d0b558 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.749 183134 DEBUG nova.compute.manager [req-073aa454-f1fc-4020-8b5a-a662811acbbd req-37bc8153-9d6b-4505-906c-3865eb1016c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.750 183134 DEBUG oslo_concurrency.lockutils [req-073aa454-f1fc-4020-8b5a-a662811acbbd req-37bc8153-9d6b-4505-906c-3865eb1016c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.750 183134 DEBUG oslo_concurrency.lockutils [req-073aa454-f1fc-4020-8b5a-a662811acbbd req-37bc8153-9d6b-4505-906c-3865eb1016c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.750 183134 DEBUG oslo_concurrency.lockutils [req-073aa454-f1fc-4020-8b5a-a662811acbbd req-37bc8153-9d6b-4505-906c-3865eb1016c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.751 183134 DEBUG nova.compute.manager [req-073aa454-f1fc-4020-8b5a-a662811acbbd req-37bc8153-9d6b-4505-906c-3865eb1016c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] No waiting events found dispatching network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.751 183134 WARNING nova.compute.manager [req-073aa454-f1fc-4020-8b5a-a662811acbbd req-37bc8153-9d6b-4505-906c-3865eb1016c0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received unexpected event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd for instance with vm_state active and task_state None.
Jan 30 04:45:30 np0005601977 podman[226738]: 2026-01-30 09:45:30.832996357 +0000 UTC m=+0.054871386 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.892 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:45:30 np0005601977 nova_compute[183130]: 2026-01-30 09:45:30.892 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:45:31 np0005601977 nova_compute[183130]: 2026-01-30 09:45:31.754 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:32 np0005601977 nova_compute[183130]: 2026-01-30 09:45:32.330 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:35 np0005601977 nova_compute[183130]: 2026-01-30 09:45:35.634 183134 DEBUG nova.compute.manager [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-changed-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:45:35 np0005601977 nova_compute[183130]: 2026-01-30 09:45:35.634 183134 DEBUG nova.compute.manager [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Refreshing instance network info cache due to event network-changed-7334f7f7-6252-4aca-8bec-af4bec57bacd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:45:35 np0005601977 nova_compute[183130]: 2026-01-30 09:45:35.635 183134 DEBUG oslo_concurrency.lockutils [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:45:35 np0005601977 nova_compute[183130]: 2026-01-30 09:45:35.635 183134 DEBUG oslo_concurrency.lockutils [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:45:35 np0005601977 nova_compute[183130]: 2026-01-30 09:45:35.635 183134 DEBUG nova.network.neutron [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Refreshing network info cache for port 7334f7f7-6252-4aca-8bec-af4bec57bacd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:45:36 np0005601977 nova_compute[183130]: 2026-01-30 09:45:36.757 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:37 np0005601977 nova_compute[183130]: 2026-01-30 09:45:37.371 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:37 np0005601977 nova_compute[183130]: 2026-01-30 09:45:37.406 183134 DEBUG nova.network.neutron [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updated VIF entry in instance network info cache for port 7334f7f7-6252-4aca-8bec-af4bec57bacd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:45:37 np0005601977 nova_compute[183130]: 2026-01-30 09:45:37.407 183134 DEBUG nova.network.neutron [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updating instance_info_cache with network_info: [{"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:45:37 np0005601977 nova_compute[183130]: 2026-01-30 09:45:37.472 183134 DEBUG oslo_concurrency.lockutils [req-2f2179f8-ddb7-4327-9069-37f81e067ea8 req-ca19f9af-7fb2-43ed-bc09-eae6350e168d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:45:39 np0005601977 podman[226781]: 2026-01-30 09:45:39.857930123 +0000 UTC m=+0.069476945 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:45:39 np0005601977 podman[226780]: 2026-01-30 09:45:39.866458348 +0000 UTC m=+0.085784443 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, release=1769056855, version=9.7, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:45:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:40Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:8c:54 10.100.0.3
Jan 30 04:45:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:45:40Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:8c:54 10.100.0.3
Jan 30 04:45:41 np0005601977 nova_compute[183130]: 2026-01-30 09:45:41.759 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:42 np0005601977 nova_compute[183130]: 2026-01-30 09:45:42.372 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:45 np0005601977 podman[226817]: 2026-01-30 09:45:45.835609695 +0000 UTC m=+0.047725751 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:45:45 np0005601977 podman[226818]: 2026-01-30 09:45:45.849923585 +0000 UTC m=+0.063803152 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:45:46 np0005601977 nova_compute[183130]: 2026-01-30 09:45:46.761 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:47 np0005601977 nova_compute[183130]: 2026-01-30 09:45:47.375 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:51 np0005601977 nova_compute[183130]: 2026-01-30 09:45:51.763 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:51 np0005601977 podman[226862]: 2026-01-30 09:45:51.850440722 +0000 UTC m=+0.072139212 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:45:52 np0005601977 nova_compute[183130]: 2026-01-30 09:45:52.378 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.454 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'name': 'tempest-TestNetworkBasicOps-server-414017162', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '33bba0bc2a744596b558c6598a1970de', 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'hostId': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.457 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'name': 'tempest-TestGettingAddress-server-320003657', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'hostId': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.486 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.486 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.520 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.requests volume: 1063 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.521 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d3704d8-00e2-4642-9a1f-19a408ec57ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.458370', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '790c0c04-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '750ae7cfb0c54807c206ea6d60f5ac7665ec38477758c378ba388181a7432f52'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': 
None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.458370', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '790c2086-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '2d42ec037895507cafe990f6398f69aa6604f3f535fd5a789bfdd5a34a50368b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1063, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.458370', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79114642-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': 'd802c32e906f7afc775c3a6350a1460482a13cf1644f53f38d471cd4b8ed3860'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.458370', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79115fa6-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': 'f8888c54fec4c8c49b03131a2f9b67f7b985cdf15520ab9f359dfbd402b07af4'}]}, 'timestamp': '2026-01-30 09:45:55.521782', '_unique_id': '203944650e2241dd8ca84691793b50f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.540 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.540 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.553 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.554 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72140378-d4dc-471f-8bd3-67419a96e147', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.525926', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79144a68-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.920877743, 'message_signature': 'b69b461c95a17d4401938150c8dbf9851547ff7a8c4a5d8ff8a83c886e01c040'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 
'22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.525926', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7914603e-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.920877743, 'message_signature': '00950920ba0e5b178e601b0424c003d60fdbeab6eff1710a3cf28582d8500fa9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.525926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '791661e0-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.936328517, 'message_signature': '339cd87364bafda957d77377bb785be10bb94a9e1a6d51d80a70162b8568741c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.525926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79167cac-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.936328517, 'message_signature': '52c182ffb5997444b2a57e7acb97b815d1520ec665e325fb704765d1d772ae38'}]}, 'timestamp': '2026-01-30 09:45:55.555373', '_unique_id': '4beebc9a9ce2496e93d8e77fb96c85b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.559 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.563 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 22257a3d-81ea-4635-a3b3-c2d6695610fc / tap7334f7f7-62 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.563 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.568 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 468f5a89-b848-45a6-8649-d09040ab2a09 / tap57979c3b-0d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.568 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 468f5a89-b848-45a6-8649-d09040ab2a09 / tap4088bc52-1b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.569 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.569 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e991fb68-fa13-4df7-b174-8aab2c4895bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.559677', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '7917e9d4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '0b65c03900ad2a931ca99dbee7b7202ef6e9edee78056ddf597df302620cf71b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.559677', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '7918b530-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'bb1124fa28d1c460056162423fbe81dd4a4cde1bd59649d00cde8a234acd1a67'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.559677', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': 
'8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '7918c50c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'add50bf269cdc9991044acebe0b3b3515d701df541eca5509e7c147a129964c5'}]}, 'timestamp': '2026-01-30 09:45:55.570221', '_unique_id': '274d35d7379743b1b0c9504e1fdb42fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.571 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.572 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.573 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.573 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.573 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52e81fc1-3d2d-421d-aa9f-57d786f0ebff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.573087', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '7919498c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '79381eefc9ddd0ffbf711193ac95db162d6c023cb9eb4a150cdb729ae5554666'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.573087', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '7919574c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '07db711d8fe6fc0d7491f7c626a446def0460fdb423940d6e4e9d5a61448dd14'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.573087', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': 
'8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '79196778-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'd52b28ebc4c17e558425f807c7de6bd801d7feaaaf0d08f4de72ff17c6141e30'}]}, 'timestamp': '2026-01-30 09:45:55.574358', '_unique_id': 'da64f60025bf487a8cdfbc9c1fd719dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.576 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.577 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.577 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.578 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f64dffea-8fbe-457e-858e-c23edacdf5d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.577054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '7919ee00-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '17611f0d01d58c3a0425e29ce193cf007fbde14b6d87b8463047902760861800'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.577054', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '791a0afc-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'e00c55cf368c62b8622033d6c53e8ef4628358f07b10058e7099b34cfee59277'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.577054', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': 
'8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '791a21e0-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '9bd9f30ef8411b000fdf892e1f75fd0d8cb5844bd3f0313d6255462322e86aa9'}]}, 'timestamp': '2026-01-30 09:45:55.579214', '_unique_id': '4050a1b2637c40e78e7303071f547f3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.582 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.603 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/memory.usage volume: 42.64453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.625 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/memory.usage volume: 43.66015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1078d05-f668-4f9c-9ff3-d5dc7538ea98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.64453125, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'timestamp': '2026-01-30T09:45:55.582986', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '791df036-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.998149871, 'message_signature': 'ab6bdde6064ee6964a39a59f63e7d66bab2d0aa317c1f836c5b8d6accf7883e4'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.66015625, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09', 'timestamp': '2026-01-30T09:45:55.582986', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '792158ca-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4887.020559164, 'message_signature': '8662f79c88db362b0d1b76a2de2f3c50ea4474a3c6907a0b3359bf476c6420b4'}]}, 'timestamp': '2026-01-30 09:45:55.626501', '_unique_id': 'd45385a4f1c942f3b2e88696a13d7ce8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.627 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.629 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.630 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>]
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.630 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.630 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.write.latency volume: 1941377064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.631 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.631 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.latency volume: 1625763224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.632 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cef3aec6-6f0d-4ace-9843-f226a236a607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1941377064, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.630721', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7922145e-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '0625f4f211f3bbf4ab5576d8614f207e8023791fc94adb28c7f10c7d8adca7d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 
'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.630721', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79222d7c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '5186bcfa435da60b3acb15aafddcd30443945c11fbddc6b19d33b6ed5d0b7c4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1625763224, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.630721', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79224000-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '5fe48b0c1d116921782578bd3fac2b0c24419910f0dce306c0c5485b2d25c4be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.630721', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7922540a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': 'f23cbe76e6139a94329e8173382ccec435d8e0409168c23ebd81b0e679012ea2'}]}, 'timestamp': '2026-01-30 09:45:55.632850', '_unique_id': '762667355758466993a7bb14a2987bbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.633 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.635 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.636 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.636 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0807e272-eaa6-4d39-b673-d220c2f47d55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.635682', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '7922d6aa-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '584e0120f6e8118340d501bfbfdd6bd195e3bc6a8ee8be5dfa7540d6cc527455'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.635682', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '7922ebea-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '5d47e4de16f39198626918b3c572bb3d3dfa5b11f254663448199e8a8a9d7ce0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.635682', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': 
'8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '7922fea0-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '1ab647e030da96d272be87409d3d13a2da6dd2a58d8edacfc0779366a2d23c6c'}]}, 'timestamp': '2026-01-30 09:45:55.637290', '_unique_id': '90fad805f47148ae95485761c2d5243a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.640 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.640 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.641 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7423aa69-0cf1-4f98-bdd5-df8fb5f133a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.640026', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '79238186-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '8712e8738ffc1b5d898ecb81b20ce4242370ad9db40f6b81ad02618d92f308f7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.640026', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '7923950e-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '693336d23c619ae49655c79df91edbf10befbca0280344ff414b78a87d06ceaa'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.640026', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '7923a9fe-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '2b202ed16a1c1fad72cd10ee53ce81522e3b9687ca2063626cbff939eb05b1ba'}]}, 'timestamp': '2026-01-30 09:45:55.641619', '_unique_id': '59a30f9212204f28bc6ae0c7b67a9615'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.644 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.644 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.645 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62d65d6c-66a6-4b0a-ba09-edf59984b23a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.644324', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '79242802-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': 'b600bc5b8336ad00238bcd4a8bff8402e4ff59977aa549be1cb0135365c7711f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.644324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '79243b58-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'aa620cd720480e62624964834f6426adc3b99f208edaefa37c9622cdc6f2b015'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.644324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': 
'8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '792450b6-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '2e3158b48f691f524ca8046fdb6f4e74800768440e375ad31e6535323d4feb41'}]}, 'timestamp': '2026-01-30 09:45:55.645885', '_unique_id': '44f2f9eeb4a145a3a3a2e91372a72e10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.646 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.648 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.649 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.649 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>]
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.650 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.651 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.651 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.652 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3e7d19d-8d5b-4b1c-9022-e47b39064b17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.650309', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79251686-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.920877743, 'message_signature': 'ac7d2ea4409a6248a0aa942c06ccabf1d018cab3877f80dd2400e48f803506f7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 
'22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.650309', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792532ec-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.920877743, 'message_signature': 'a0d6a489996da23d69e6d2755d15146350578d65e4cfe09b136bc43bae15282b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.650309', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79254a0c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.936328517, 'message_signature': 'd2067fcd4fb05e7d1d62b53827d3728e9a4ad23bf141b1f86ebfe7eeeb735053'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.650309', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79255ff6-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.936328517, 'message_signature': '6a41dbb4fd499b53a6955bad850cb99ac42c03da6b1e6d541135685dd04b9b97'}]}, 'timestamp': '2026-01-30 09:45:55.652835', '_unique_id': '9da238e727734a2f9f07e4b93cae0ee4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.654 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.655 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.656 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.656 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>]
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.656 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.656 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.657 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.657 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '489b568c-9f93-4f2d-89c8-d48ab0207f55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.656712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '79260a14-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': 'bdf7a7b1df78d9faf41bbd74b329c049801d764fc4f8775f8c18339056a8edbf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.656712', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '7926203a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'a58c6029b12f5687feff1a040df323c14980ae449ea39cfca81befb8da030e5f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.656712', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': 
'8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '792630fc-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '89ad0ae3c249b90f9a69baab61a92dbf97618e960295949bb077c4ebc4bbf1c6'}]}, 'timestamp': '2026-01-30 09:45:55.658233', '_unique_id': '71ab8468c99b4b8aaab8075b1bad220b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.659 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.660 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.write.requests volume: 304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.661 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.661 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.requests volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.662 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed81dc0c-d95a-4f48-8017-766b11ca3236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 304, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.660655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7926a49c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '774e26fa5a35fe1eff1ad89ac8cd271f45ea9043b6bd938f9563eaa671d5dec3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': 
None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.660655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7926baf4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': 'acb2a4a0c5b382c3522a4f31a9369141ff0725ffc56c5f351f87d3a158e55401'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 336, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.660655', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7926cb5c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '70731c9b5d4be348ed5f3d96513ba1f8464a2ecc116603d726b489603a53c0a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.660655', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7926e060-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '0a67db7b61e14fb26785980ff86e4f9fa27ec3dd4a5606af016b101435b653c9'}]}, 'timestamp': '2026-01-30 09:45:55.662665', '_unique_id': 'c42c74bb5dc24c0faddbc2d92e3ddeee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.663 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.665 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.666 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.666 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41144779-8841-4930-bbbd-e649497678b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.665645', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '79276a1c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '98bd070b10cf2458b91328874eb22cef5569bd41f989eee5db9c88965e01dbdf'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.665645', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '79277e3a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '4e20b254efbf4f7053401df0d6db087d0cfb534d851ce85f6be0c3bc2abb7504'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.665645', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '79279168-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'ef9dc916bdee677cd16851dec92f69de0b78b8c5915873f96f685deeaaa19731'}]}, 'timestamp': '2026-01-30 09:45:55.667305', '_unique_id': '78667efc171c41edac74f982d8f73de1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.668 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.670 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.write.bytes volume: 72937472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.670 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.671 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.bytes volume: 73089024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.671 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6d78478-7d19-45e4-b4ba-ca815f765512', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72937472, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.670050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79281534-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': 'db2840ef3bd0d14408f62281bb414113714d0bc58119859b64778f895f400323'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 
'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.670050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79282a56-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '59f54cad3906ea7cec07e9818ddfa97b3ed65bba58845930008899b1e1cba27d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73089024, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.670050', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79283e42-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '928d8d963dade67048dbb715da13d846a15726c5c6b052e743db82edb6c13e0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.670050', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79285288-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '0da639d59623f7d5c18c8c9a82bb2936e66916d042c7d4d178e4eda01baa2a9e'}]}, 'timestamp': '2026-01-30 09:45:55.672158', '_unique_id': '6d4f8b5c0f924604a6d0d044908ffa4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.673 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.674 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.675 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/cpu volume: 10190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.675 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/cpu volume: 10940000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8db10d0-cc3f-4519-b005-36715848e86f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10190000000, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'timestamp': '2026-01-30T09:45:55.674956', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7928d7bc-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.998149871, 'message_signature': 'ed7854562cfcd264dee6597f2f0c0c8cebcddecf87091f5470f9084ffd1b315b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10940000000, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09', 'timestamp': '2026-01-30T09:45:55.674956', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7928ee96-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4887.020559164, 'message_signature': '25fcd5b3acb12988fd4225530d3d005719c19cfa7c17ba814b6e552b6361fdbc'}]}, 'timestamp': '2026-01-30 09:45:55.676255', '_unique_id': '70bf2229c5cc4656bf3f188aff3251c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.677 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.678 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.679 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.679 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes volume: 2708 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68fe0dd0-bf51-4152-9a1f-d2aaefe1f5a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.678756', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '79296704-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': '5b76372f0ca95a4950074f3fa7f5fc9564067e8e7406dd9e2dc10a2624dc2bd3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.678756', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '79297a82-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '767582d1cf9e679cb3fb62dbd9bf42eed12125e6500fc46c0ecc9a68bba0b32b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2708, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.678756', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '79298b1c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': '58b778e2af897adca9a63d1c1a7dd38db307f14a68c45bc937cb7f929718e21f'}]}, 'timestamp': '2026-01-30 09:45:55.680146', '_unique_id': '22bd805f1bdc49fdbf959da62efc7507'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.681 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.682 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.682 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-414017162>, <NovaLikeServer: tempest-TestGettingAddress-server-320003657>]
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.682 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.683 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.read.latency volume: 424160546 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.683 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.read.latency volume: 40518814 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.683 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.latency volume: 489043333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.684 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.latency volume: 43086902 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '093b947b-dcf2-4b83-b0c5-5d73284196ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 424160546, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.683088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '792a1122-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': 'b031281e55852a02568d21cf38d5636611bd30abeddcf179410a6a7aec0759b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40518814, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': 
None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.683088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792a21c6-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': 'b526a0238a78a20f0610c721da615999ac990cd57dda73947cabebd5a68a1189'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 489043333, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.683088', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '792a347c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '5c062dbe90bf0239333dce556982f90838283007789544be05c46d0f229b6b00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43086902, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.683088', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792a4534-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '6598308dce98b476d56ad0a207cfb88eaf91f69233e2ed95dec537d0bbfb98cb'}]}, 'timestamp': '2026-01-30 09:45:55.684896', '_unique_id': '1bde3b236f514e51beb8913d35546df8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.686 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.687 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.688 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.688 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.bytes volume: 29415936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.689 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6260219f-e20a-4722-97ee-a6d8c293646c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.687703', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '792ac5f4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': '9e0103ba0da8a92a97b6113909befe55501ac2bdf95de169271e1c5cc718a352'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 
'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.687703', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792ada08-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.853263563, 'message_signature': 'd8e130332d85fc4a65d410751e24c0b1640d29445bd5ad5b01608817aaab8c93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29415936, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.687703', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '792aedae-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': 'e52b8bf5c5d6c7f15e08f966ec44d03adb3c249e12519ffa5bdb6ea0ffa92510'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.687703', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792b064a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.882413049, 'message_signature': '6c47df13ba5a53f60531e1d544b6e8b54a4140a05e3753f756f99804c639bf7d'}]}, 'timestamp': '2026-01-30 09:45:55.689936', '_unique_id': 'c1a8c592918e4b74af6e509eef08ff8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.690 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.691 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.691 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.692 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.692 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.692 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80bbd6af-9f25-40b0-8531-8e27ef557551', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc-vda', 'timestamp': '2026-01-30T09:45:55.691882', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '792b6504-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.920877743, 'message_signature': 'a00b87d8072f94052cbdc60231e832c237e0fd2ada46542c8021632baa321df9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 
'22257a3d-81ea-4635-a3b3-c2d6695610fc-sda', 'timestamp': '2026-01-30T09:45:55.691882', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'instance-0000002f', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792b7166-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.920877743, 'message_signature': '200784d58e0430841474df7b6d5be1906940cac26116b1368d797abfc97ca5cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:45:55.691882', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 
'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '792b7bca-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.936328517, 'message_signature': '497e15969ce6682e4e545de70f00c0c2872b7cd20f8178a65b738d323d86df66'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:45:55.691882', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '792b8566-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.936328517, 'message_signature': '397d8e0263bcf4447d73f2f4f109be19a3d95207e0aa19248a3efa0ffee0c8d9'}]}, 'timestamp': '2026-01-30 09:45:55.693013', '_unique_id': 'fc8af2840f7b4deaa2177606c382b2c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.693 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.694 12 DEBUG ceilometer.compute.pollsters [-] 22257a3d-81ea-4635-a3b3-c2d6695610fc/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.695 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.695 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes volume: 740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6255b74e-1aeb-404b-9dc6-184463a9d7fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_name': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_name': None, 'resource_id': 'instance-0000002f-22257a3d-81ea-4635-a3b3-c2d6695610fc-tap7334f7f7-62', 'timestamp': '2026-01-30T09:45:55.694902', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-414017162', 'name': 'tap7334f7f7-62', 'instance_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'instance_type': 'm1.nano', 'host': '5fdb5b5a111f03269a3fc13ffa11a2c2f896142cacabd116b46b1e9e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:8c:54', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7334f7f7-62'}, 'message_id': '792bdbd8-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.954614442, 'message_signature': 'e3ce3c85e5fb3bef2f94b6c8164b068cc03b35adf42bafa2dce79dda42ad8aef'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:45:55.694902', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '792bec54-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'd6612955a365227b046f7718ec49e8e1d82321e6ec990f6c4a25198a8253141f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 740, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:45:55.694902', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '792bfac8-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 4886.959507592, 'message_signature': 'd3820109d5d00da5bca29c9f7aa7e852d98fd9a9399d10a988cd6eb3f9398b77'}]}, 'timestamp': '2026-01-30 09:45:55.696098', '_unique_id': '59056699c3b547d79fb9435d231278b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:45:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:45:55.696 12 ERROR oslo_messaging.notify.messaging 
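Both ceilometer tracebacks above end in the same rewrapping step: kombu's `_reraise_as_library_errors` context manager catches the low-level `ConnectionRefusedError` from the socket connect and re-raises it as `kombu.exceptions.OperationalError`, which is what `oslo_messaging.notify.messaging` ultimately reports. A minimal stdlib sketch of that pattern follows; the `OperationalError` class, function names, and port choice here are illustrative stand-ins, not kombu's actual code:

```python
import contextlib
import socket


class OperationalError(Exception):
    """Illustrative stand-in for kombu.exceptions.OperationalError."""


@contextlib.contextmanager
def reraise_as_library_errors(wrapper=OperationalError):
    # Approximates kombu's _reraise_as_library_errors: low-level OS/socket
    # errors are re-raised as the library's own exception type, keeping the
    # original as __cause__ -- the "direct cause of the following exception"
    # chaining visible in the log's paired tracebacks.
    try:
        yield
    except OSError as exc:
        raise wrapper(str(exc)) from exc


def notify_refused(host="127.0.0.1", port=1):
    # Port 1 is assumed to have no listener, so connect() fails with
    # ECONNREFUSED, mirroring the agent's failed AMQP connection.
    with reraise_as_library_errors():
        with socket.create_connection((host, port), timeout=1):
            pass
```

On a host with nothing listening on the chosen port, `notify_refused()` raises the wrapper exception with the original `OSError` preserved as `__cause__`, which is why the log shows the `ConnectionRefusedError` traceback first and the `OperationalError` traceback second.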
Jan 30 04:45:56 np0005601977 nova_compute[183130]: 2026-01-30 09:45:56.765 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:57 np0005601977 nova_compute[183130]: 2026-01-30 09:45:57.379 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:45:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:57.399 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:45:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:57.399 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:45:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:45:57.400 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:46:01 np0005601977 nova_compute[183130]: 2026-01-30 09:46:01.767 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:01 np0005601977 podman[226888]: 2026-01-30 09:46:01.833399154 +0000 UTC m=+0.050340766 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:46:02 np0005601977 nova_compute[183130]: 2026-01-30 09:46:02.382 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:05 np0005601977 ovn_controller[95460]: 2026-01-30T09:46:05Z|00476|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Jan 30 04:46:06 np0005601977 nova_compute[183130]: 2026-01-30 09:46:06.771 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:07 np0005601977 nova_compute[183130]: 2026-01-30 09:46:07.429 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.492 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.492 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.493 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.493 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.589 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.667 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.669 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.715 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.720 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.774 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.775 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:46:09 np0005601977 nova_compute[183130]: 2026-01-30 09:46:09.848 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.015 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.017 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5343MB free_disk=73.19081497192383GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.018 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.018 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.124 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.125 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 22257a3d-81ea-4635-a3b3-c2d6695610fc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.125 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.126 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.194 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.208 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.230 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:46:10 np0005601977 nova_compute[183130]: 2026-01-30 09:46:10.231 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:46:10 np0005601977 podman[226925]: 2026-01-30 09:46:10.843969505 +0000 UTC m=+0.061438404 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 30 04:46:10 np0005601977 podman[226926]: 2026-01-30 09:46:10.856007941 +0000 UTC m=+0.074546241 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:46:11 np0005601977 nova_compute[183130]: 2026-01-30 09:46:11.772 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:12 np0005601977 nova_compute[183130]: 2026-01-30 09:46:12.432 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:16 np0005601977 nova_compute[183130]: 2026-01-30 09:46:16.775 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:16 np0005601977 podman[226967]: 2026-01-30 09:46:16.862324032 +0000 UTC m=+0.065782729 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:46:16 np0005601977 podman[226968]: 2026-01-30 09:46:16.862328322 +0000 UTC m=+0.066422507 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:46:17 np0005601977 nova_compute[183130]: 2026-01-30 09:46:17.434 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:20 np0005601977 nova_compute[183130]: 2026-01-30 09:46:20.230 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:20 np0005601977 nova_compute[183130]: 2026-01-30 09:46:20.231 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:20 np0005601977 nova_compute[183130]: 2026-01-30 09:46:20.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:20 np0005601977 nova_compute[183130]: 2026-01-30 09:46:20.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.545 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.546 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.546 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.547 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:46:21 np0005601977 nova_compute[183130]: 2026-01-30 09:46:21.817 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:22 np0005601977 nova_compute[183130]: 2026-01-30 09:46:22.436 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:22 np0005601977 podman[227008]: 2026-01-30 09:46:22.951887172 +0000 UTC m=+0.170776523 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:46:25 np0005601977 nova_compute[183130]: 2026-01-30 09:46:25.502 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:46:25 np0005601977 nova_compute[183130]: 2026-01-30 09:46:25.526 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:46:25 np0005601977 nova_compute[183130]: 2026-01-30 09:46:25.526 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:46:25 np0005601977 nova_compute[183130]: 2026-01-30 09:46:25.526 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:25 np0005601977 nova_compute[183130]: 2026-01-30 09:46:25.527 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:25 np0005601977 nova_compute[183130]: 2026-01-30 09:46:25.527 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:26 np0005601977 nova_compute[183130]: 2026-01-30 09:46:26.819 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:27 np0005601977 nova_compute[183130]: 2026-01-30 09:46:27.438 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:31 np0005601977 nova_compute[183130]: 2026-01-30 09:46:31.821 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:32 np0005601977 nova_compute[183130]: 2026-01-30 09:46:32.440 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:32 np0005601977 podman[227036]: 2026-01-30 09:46:32.865893916 +0000 UTC m=+0.075726815 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:46:34 np0005601977 nova_compute[183130]: 2026-01-30 09:46:34.522 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:35.606 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:46:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:35.607 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:46:35 np0005601977 nova_compute[183130]: 2026-01-30 09:46:35.607 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:36 np0005601977 nova_compute[183130]: 2026-01-30 09:46:36.823 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:37 np0005601977 nova_compute[183130]: 2026-01-30 09:46:37.482 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:37 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:37.609 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:46:41 np0005601977 nova_compute[183130]: 2026-01-30 09:46:41.825 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:41 np0005601977 podman[227063]: 2026-01-30 09:46:41.84331365 +0000 UTC m=+0.059118728 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 30 04:46:41 np0005601977 podman[227062]: 2026-01-30 09:46:41.849046975 +0000 UTC m=+0.062174716 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git)
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.485 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.608 183134 DEBUG nova.compute.manager [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-changed-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.609 183134 DEBUG nova.compute.manager [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Refreshing instance network info cache due to event network-changed-7334f7f7-6252-4aca-8bec-af4bec57bacd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.609 183134 DEBUG oslo_concurrency.lockutils [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.609 183134 DEBUG oslo_concurrency.lockutils [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.610 183134 DEBUG nova.network.neutron [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Refreshing network info cache for port 7334f7f7-6252-4aca-8bec-af4bec57bacd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.935 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.935 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.935 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.936 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.936 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.937 183134 INFO nova.compute.manager [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Terminating instance#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.938 183134 DEBUG nova.compute.manager [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:46:42 np0005601977 kernel: tap7334f7f7-62 (unregistering): left promiscuous mode
Jan 30 04:46:42 np0005601977 NetworkManager[55565]: <info>  [1769766402.9658] device (tap7334f7f7-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.974 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:46:42Z|00477|binding|INFO|Releasing lport 7334f7f7-6252-4aca-8bec-af4bec57bacd from this chassis (sb_readonly=0)
Jan 30 04:46:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:46:42Z|00478|binding|INFO|Setting lport 7334f7f7-6252-4aca-8bec-af4bec57bacd down in Southbound
Jan 30 04:46:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:46:42Z|00479|binding|INFO|Removing iface tap7334f7f7-62 ovn-installed in OVS
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.981 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:42 np0005601977 nova_compute[183130]: 2026-01-30 09:46:42.987 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:42.992 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:8c:54 10.100.0.3'], port_security=['fa:16:3e:82:8c:54 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '22257a3d-81ea-4635-a3b3-c2d6695610fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-507230f8-8e41-47dc-b940-3f8b7601fa57', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': '91d771d2-9575-4f5c-855f-5bd2e916c9e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b180ff09-b150-4828-9ed2-8fd824c54036, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=7334f7f7-6252-4aca-8bec-af4bec57bacd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:46:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:42.993 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 7334f7f7-6252-4aca-8bec-af4bec57bacd in datapath 507230f8-8e41-47dc-b940-3f8b7601fa57 unbound from our chassis#033[00m
Jan 30 04:46:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:42.994 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 507230f8-8e41-47dc-b940-3f8b7601fa57, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:46:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:42.996 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f02941ca-6c57-4798-b692-708f95a4ea27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:46:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:42.996 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57 namespace which is not needed anymore#033[00m
Jan 30 04:46:43 np0005601977 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 30 04:46:43 np0005601977 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000002f.scope: Consumed 14.194s CPU time.
Jan 30 04:46:43 np0005601977 systemd-machined[154431]: Machine qemu-38-instance-0000002f terminated.
Jan 30 04:46:43 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [NOTICE]   (226720) : haproxy version is 2.8.14-c23fe91
Jan 30 04:46:43 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [NOTICE]   (226720) : path to executable is /usr/sbin/haproxy
Jan 30 04:46:43 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [WARNING]  (226720) : Exiting Master process...
Jan 30 04:46:43 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [ALERT]    (226720) : Current worker (226722) exited with code 143 (Terminated)
Jan 30 04:46:43 np0005601977 neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57[226716]: [WARNING]  (226720) : All workers exited. Exiting... (0)
Jan 30 04:46:43 np0005601977 systemd[1]: libpod-7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537.scope: Deactivated successfully.
Jan 30 04:46:43 np0005601977 podman[227128]: 2026-01-30 09:46:43.132209395 +0000 UTC m=+0.055495934 container died 7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.155 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.160 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:43 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537-userdata-shm.mount: Deactivated successfully.
Jan 30 04:46:43 np0005601977 systemd[1]: var-lib-containers-storage-overlay-27a9bec462368ff477c62b204c019c0654d536c795d4d88833435aa7c6ff5eb9-merged.mount: Deactivated successfully.
Jan 30 04:46:43 np0005601977 podman[227128]: 2026-01-30 09:46:43.176094105 +0000 UTC m=+0.099380634 container cleanup 7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 30 04:46:43 np0005601977 systemd[1]: libpod-conmon-7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537.scope: Deactivated successfully.
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.184 183134 INFO nova.virt.libvirt.driver [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Instance destroyed successfully.#033[00m
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.185 183134 DEBUG nova.objects.instance [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid 22257a3d-81ea-4635-a3b3-c2d6695610fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.199 183134 DEBUG nova.virt.libvirt.vif [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:45:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-414017162',display_name='tempest-TestNetworkBasicOps-server-414017162',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-414017162',id=47,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7u/zck9RgprU+Y3YnA2bCIqvkfD1ap714bw6i8E8rYZ2NRSYp8uofp1yF0TaYfoaRcBHVp/jA+XklGxN4OU0MpZMoZrrlhNIx4e14K3CIwumlYBFtyvuscuIoCk7KZUg==',key_name='tempest-TestNetworkBasicOps-172188972',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:45:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-l3nynb42',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:45:28Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=22257a3d-81ea-4635-a3b3-c2d6695610fc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.199 183134 DEBUG nova.network.os_vif_util [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.200 183134 DEBUG nova.network.os_vif_util [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.201 183134 DEBUG os_vif [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.202 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.202 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7334f7f7-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.203 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.204 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.206 183134 INFO os_vif [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:8c:54,bridge_name='br-int',has_traffic_filtering=True,id=7334f7f7-6252-4aca-8bec-af4bec57bacd,network=Network(507230f8-8e41-47dc-b940-3f8b7601fa57),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7334f7f7-62')
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.207 183134 INFO nova.virt.libvirt.driver [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Deleting instance files /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc_del
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.208 183134 INFO nova.virt.libvirt.driver [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Deletion of /var/lib/nova/instances/22257a3d-81ea-4635-a3b3-c2d6695610fc_del complete
Jan 30 04:46:43 np0005601977 podman[227174]: 2026-01-30 09:46:43.233736239 +0000 UTC m=+0.043026706 container remove 7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.237 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[906ff21e-cb6f-4740-916f-82efa1888cd9]: (4, ('Fri Jan 30 09:46:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57 (7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537)\n7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537\nFri Jan 30 09:46:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57 (7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537)\n7ee4864591b7932c18ac6310b81491ffa612541306e0c6c89e1e9b49b2989537\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.239 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c358a115-16ed-41a5-90f1-2f98e1a14f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.241 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap507230f8-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:46:43 np0005601977 kernel: tap507230f8-80: left promiscuous mode
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.246 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.250 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.252 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bde6f91b-4801-4d09-9b03-e119d2d3d1d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.267 183134 INFO nova.compute.manager [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.268 183134 DEBUG oslo.service.loopingcall [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.268 183134 DEBUG nova.compute.manager [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 30 04:46:43 np0005601977 nova_compute[183130]: 2026-01-30 09:46:43.268 183134 DEBUG nova.network.neutron [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.276 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[71f1705f-318c-45a9-93a6-a525e2bd7110]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.278 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[95cdfeb5-249b-41c4-b3ef-816ef59b5429]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.289 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[edde33ea-8bc4-4a74-9705-4fadc4875063]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485915, 'reachable_time': 26543, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227187, 'error': None, 'target': 'ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.292 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-507230f8-8e41-47dc-b940-3f8b7601fa57 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 30 04:46:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:43.292 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[80b6a94d-384d-474d-a41b-9b9983e6b971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 30 04:46:43 np0005601977 systemd[1]: run-netns-ovnmeta\x2d507230f8\x2d8e41\x2d47dc\x2db940\x2d3f8b7601fa57.mount: Deactivated successfully.
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.595 183134 DEBUG nova.network.neutron [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.618 183134 INFO nova.compute.manager [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Took 1.35 seconds to deallocate network for instance.
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.696 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.697 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.706 183134 DEBUG nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-vif-unplugged-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.707 183134 DEBUG oslo_concurrency.lockutils [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.707 183134 DEBUG oslo_concurrency.lockutils [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.708 183134 DEBUG oslo_concurrency.lockutils [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.708 183134 DEBUG nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] No waiting events found dispatching network-vif-unplugged-7334f7f7-6252-4aca-8bec-af4bec57bacd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.709 183134 WARNING nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received unexpected event network-vif-unplugged-7334f7f7-6252-4aca-8bec-af4bec57bacd for instance with vm_state deleted and task_state None.
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.709 183134 DEBUG nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.710 183134 DEBUG oslo_concurrency.lockutils [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.710 183134 DEBUG oslo_concurrency.lockutils [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.710 183134 DEBUG oslo_concurrency.lockutils [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.711 183134 DEBUG nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] No waiting events found dispatching network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.711 183134 WARNING nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received unexpected event network-vif-plugged-7334f7f7-6252-4aca-8bec-af4bec57bacd for instance with vm_state deleted and task_state None.
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.712 183134 DEBUG nova.compute.manager [req-2b904fc7-d0da-4215-a06f-3351bb8294f9 req-9f634fd6-31cc-4d7c-8871-052400daf953 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Received event network-vif-deleted-7334f7f7-6252-4aca-8bec-af4bec57bacd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.781 183134 DEBUG nova.compute.provider_tree [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.800 183134 DEBUG nova.scheduler.client.report [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.814 183134 DEBUG nova.network.neutron [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updated VIF entry in instance network info cache for port 7334f7f7-6252-4aca-8bec-af4bec57bacd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.814 183134 DEBUG nova.network.neutron [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Updating instance_info_cache with network_info: [{"id": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "address": "fa:16:3e:82:8c:54", "network": {"id": "507230f8-8e41-47dc-b940-3f8b7601fa57", "bridge": "br-int", "label": "tempest-network-smoke--2064106103", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7334f7f7-62", "ovs_interfaceid": "7334f7f7-6252-4aca-8bec-af4bec57bacd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.840 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.844 183134 DEBUG oslo_concurrency.lockutils [req-6c2a3002-63e1-47d0-8468-0294b8fa6d5d req-a46834f8-b380-43b7-aaac-2e6d1367b82d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-22257a3d-81ea-4635-a3b3-c2d6695610fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.866 183134 INFO nova.scheduler.client.report [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 22257a3d-81ea-4635-a3b3-c2d6695610fc
Jan 30 04:46:44 np0005601977 nova_compute[183130]: 2026-01-30 09:46:44.935 183134 DEBUG oslo_concurrency.lockutils [None req-4d83792c-3739-40f5-83ae-02d6e07b3da4 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "22257a3d-81ea-4635-a3b3-c2d6695610fc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:46:47 np0005601977 nova_compute[183130]: 2026-01-30 09:46:47.488 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:47 np0005601977 podman[227189]: 2026-01-30 09:46:47.854260775 +0000 UTC m=+0.062675570 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:46:47 np0005601977 podman[227188]: 2026-01-30 09:46:47.861359459 +0000 UTC m=+0.069690772 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:46:48 np0005601977 nova_compute[183130]: 2026-01-30 09:46:48.204 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:52 np0005601977 nova_compute[183130]: 2026-01-30 09:46:52.546 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:46:52Z|00480|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:46:52 np0005601977 ovn_controller[95460]: 2026-01-30T09:46:52Z|00481|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:46:52 np0005601977 nova_compute[183130]: 2026-01-30 09:46:52.932 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:53 np0005601977 nova_compute[183130]: 2026-01-30 09:46:53.205 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:46:53 np0005601977 podman[227231]: 2026-01-30 09:46:53.879257843 +0000 UTC m=+0.091871948 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:46:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:57.399 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:46:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:57.400 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:46:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:46:57.400 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:46:57 np0005601977 nova_compute[183130]: 2026-01-30 09:46:57.549 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:58 np0005601977 nova_compute[183130]: 2026-01-30 09:46:58.182 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766403.1813061, 22257a3d-81ea-4635-a3b3-c2d6695610fc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:46:58 np0005601977 nova_compute[183130]: 2026-01-30 09:46:58.183 183134 INFO nova.compute.manager [-] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:46:58 np0005601977 nova_compute[183130]: 2026-01-30 09:46:58.236 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:46:58 np0005601977 nova_compute[183130]: 2026-01-30 09:46:58.284 183134 DEBUG nova.compute.manager [None req-33bed9e1-b26e-449a-a856-cee1213348ff - - - - - -] [instance: 22257a3d-81ea-4635-a3b3-c2d6695610fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:46:59 np0005601977 nova_compute[183130]: 2026-01-30 09:46:59.973 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:02 np0005601977 nova_compute[183130]: 2026-01-30 09:47:02.550 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:03 np0005601977 nova_compute[183130]: 2026-01-30 09:47:03.238 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:03 np0005601977 podman[227258]: 2026-01-30 09:47:03.577461585 +0000 UTC m=+0.044935451 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:47:05 np0005601977 nova_compute[183130]: 2026-01-30 09:47:05.929 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:07 np0005601977 nova_compute[183130]: 2026-01-30 09:47:07.437 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:07 np0005601977 nova_compute[183130]: 2026-01-30 09:47:07.590 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:08 np0005601977 nova_compute[183130]: 2026-01-30 09:47:08.240 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.386 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.407 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.408 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.408 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.409 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.488 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.563 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.564 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.615 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.768 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.770 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5471MB free_disk=73.2194709777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.770 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.771 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.844 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.845 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.845 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.896 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.910 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.928 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:47:10 np0005601977 nova_compute[183130]: 2026-01-30 09:47:10.929 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:12 np0005601977 nova_compute[183130]: 2026-01-30 09:47:12.625 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:12 np0005601977 podman[227290]: 2026-01-30 09:47:12.822971891 +0000 UTC m=+0.038661010 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 30 04:47:12 np0005601977 podman[227289]: 2026-01-30 09:47:12.839077724 +0000 UTC m=+0.055499634 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, container_name=openstack_network_exporter, io.openshift.expose-services=)
Jan 30 04:47:13 np0005601977 nova_compute[183130]: 2026-01-30 09:47:13.244 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:13 np0005601977 nova_compute[183130]: 2026-01-30 09:47:13.862 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:13 np0005601977 nova_compute[183130]: 2026-01-30 09:47:13.862 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:13 np0005601977 nova_compute[183130]: 2026-01-30 09:47:13.920 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.017 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.018 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.023 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.023 183134 INFO nova.compute.claims [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.176 183134 DEBUG nova.compute.provider_tree [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.203 183134 DEBUG nova.scheduler.client.report [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.251 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.252 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.299 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.299 183134 DEBUG nova.network.neutron [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.338 183134 INFO nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.370 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.499 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.501 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.502 183134 INFO nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Creating image(s)#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.503 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.504 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.505 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.530 183134 DEBUG nova.policy [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.535 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.613 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.615 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.616 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.642 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.728 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.729 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.768 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.769 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.770 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.843 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.844 183134 DEBUG nova.virt.disk.api [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.844 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.897 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.898 183134 DEBUG nova.virt.disk.api [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.898 183134 DEBUG nova.objects.instance [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.925 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.925 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Ensure instance console log exists: /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.925 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.926 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:47:14 np0005601977 nova_compute[183130]: 2026-01-30 09:47:14.926 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:47:15 np0005601977 nova_compute[183130]: 2026-01-30 09:47:15.573 183134 DEBUG nova.network.neutron [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Successfully created port: 029118c4-1eaa-455b-b4e6-533ad78399bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.728 183134 DEBUG nova.network.neutron [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Successfully updated port: 029118c4-1eaa-455b-b4e6-533ad78399bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.747 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.748 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.748 183134 DEBUG nova.network.neutron [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.851 183134 DEBUG nova.compute.manager [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-changed-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.852 183134 DEBUG nova.compute.manager [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Refreshing instance network info cache due to event network-changed-029118c4-1eaa-455b-b4e6-533ad78399bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.852 183134 DEBUG oslo_concurrency.lockutils [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:47:16 np0005601977 nova_compute[183130]: 2026-01-30 09:47:16.892 183134 DEBUG nova.network.neutron [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.903 183134 DEBUG nova.network.neutron [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updating instance_info_cache with network_info: [{"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.974 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.975 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Instance network_info: |[{"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.976 183134 DEBUG oslo_concurrency.lockutils [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.976 183134 DEBUG nova.network.neutron [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Refreshing network info cache for port 029118c4-1eaa-455b-b4e6-533ad78399bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.980 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Start _get_guest_xml network_info=[{"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.986 183134 WARNING nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.991 183134 DEBUG nova.virt.libvirt.host [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.992 183134 DEBUG nova.virt.libvirt.host [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.994 183134 DEBUG nova.virt.libvirt.host [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.995 183134 DEBUG nova.virt.libvirt.host [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.996 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.996 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.997 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.997 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.997 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.997 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.997 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.998 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.998 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.998 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.998 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 30 04:47:17 np0005601977 nova_compute[183130]: 2026-01-30 09:47:17.998 183134 DEBUG nova.virt.hardware [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.002 183134 DEBUG nova.virt.libvirt.vif [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:47:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1470633199',display_name='tempest-TestNetworkBasicOps-server-1470633199',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1470633199',id=48,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDtLI36dTVl5yAjyfdZZELwDSZS9wXpCc6/82/IE7B7+EDztXf3ZURYQpFV26TZLdNnJDVIQMy00x+zHb2RtXAjDZ9E52Cat1/yiKqH/3YEKMozEBByujXO07DPs/YgxdA==',key_name='tempest-TestNetworkBasicOps-2080650735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-garak7p0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:47:14Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=61bbc8ff-22ea-4c6a-a9cb-7483e849e44d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.002 183134 DEBUG nova.network.os_vif_util [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.003 183134 DEBUG nova.network.os_vif_util [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.004 183134 DEBUG nova.objects.instance [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'pci_devices' on Instance uuid 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.246 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.263 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <uuid>61bbc8ff-22ea-4c6a-a9cb-7483e849e44d</uuid>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <name>instance-00000030</name>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestNetworkBasicOps-server-1470633199</nova:name>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:47:17</nova:creationTime>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:user uuid="a6dc933f15eb45099f724c1eb5d822e4">tempest-TestNetworkBasicOps-345324916-project-member</nova:user>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:project uuid="33bba0bc2a744596b558c6598a1970de">tempest-TestNetworkBasicOps-345324916</nova:project>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        <nova:port uuid="029118c4-1eaa-455b-b4e6-533ad78399bf">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <entry name="serial">61bbc8ff-22ea-4c6a-a9cb-7483e849e44d</entry>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <entry name="uuid">61bbc8ff-22ea-4c6a-a9cb-7483e849e44d</entry>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.config"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:ce:b2:18"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <target dev="tap029118c4-1e"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/console.log" append="off"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:47:18 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:47:18 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:47:18 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:47:18 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.264 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Preparing to wait for external event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.265 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.265 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.265 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.266 183134 DEBUG nova.virt.libvirt.vif [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:47:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1470633199',display_name='tempest-TestNetworkBasicOps-server-1470633199',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1470633199',id=48,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDtLI36dTVl5yAjyfdZZELwDSZS9wXpCc6/82/IE7B7+EDztXf3ZURYQpFV26TZLdNnJDVIQMy00x+zHb2RtXAjDZ9E52Cat1/yiKqH/3YEKMozEBByujXO07DPs/YgxdA==',key_name='tempest-TestNetworkBasicOps-2080650735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-garak7p0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:47:14Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=61bbc8ff-22ea-4c6a-a9cb-7483e849e44d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.266 183134 DEBUG nova.network.os_vif_util [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.267 183134 DEBUG nova.network.os_vif_util [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.267 183134 DEBUG os_vif [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.268 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.268 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.268 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.271 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.271 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap029118c4-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.271 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap029118c4-1e, col_values=(('external_ids', {'iface-id': '029118c4-1eaa-455b-b4e6-533ad78399bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:b2:18', 'vm-uuid': '61bbc8ff-22ea-4c6a-a9cb-7483e849e44d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:18 np0005601977 NetworkManager[55565]: <info>  [1769766438.2739] manager: (tap029118c4-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.278 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.279 183134 INFO os_vif [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e')#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.365 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.366 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.367 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] No VIF found with MAC fa:16:3e:ce:b2:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.367 183134 INFO nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Using config drive#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.726 183134 INFO nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Creating config drive at /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.config#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.730 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3cgkat27 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:47:18 np0005601977 podman[227352]: 2026-01-30 09:47:18.82465439 +0000 UTC m=+0.039020571 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 30 04:47:18 np0005601977 podman[227353]: 2026-01-30 09:47:18.832888046 +0000 UTC m=+0.040857724 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.844 183134 DEBUG oslo_concurrency.processutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3cgkat27" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:47:18 np0005601977 kernel: tap029118c4-1e: entered promiscuous mode
Jan 30 04:47:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:18Z|00482|binding|INFO|Claiming lport 029118c4-1eaa-455b-b4e6-533ad78399bf for this chassis.
Jan 30 04:47:18 np0005601977 NetworkManager[55565]: <info>  [1769766438.8945] manager: (tap029118c4-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Jan 30 04:47:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:18Z|00483|binding|INFO|029118c4-1eaa-455b-b4e6-533ad78399bf: Claiming fa:16:3e:ce:b2:18 10.100.0.12
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.894 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:18Z|00484|binding|INFO|Setting lport 029118c4-1eaa-455b-b4e6-533ad78399bf ovn-installed in OVS
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.900 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:18Z|00485|binding|INFO|Setting lport 029118c4-1eaa-455b-b4e6-533ad78399bf up in Southbound
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.903 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:b2:18 10.100.0.12'], port_security=['fa:16:3e:ce:b2:18 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '61bbc8ff-22ea-4c6a-a9cb-7483e849e44d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e8e00093-d7b1-4a9a-82d3-105f1c49d3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d05f7c8-4e37-46be-ac35-803db3090f50, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=029118c4-1eaa-455b-b4e6-533ad78399bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.904 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 029118c4-1eaa-455b-b4e6-533ad78399bf in datapath b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 bound to our chassis#033[00m
Jan 30 04:47:18 np0005601977 nova_compute[183130]: 2026-01-30 09:47:18.904 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.905 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.916 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf0dcbf-3be9-4061-956b-f60bb5269d97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.917 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1c30d88-21 in ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.919 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1c30d88-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.919 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8667adfa-9f21-483c-9a38-439e0fff9be2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.920 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f239c8a2-4533-40ea-9a17-5d01c2f9f62f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 systemd-udevd[227409]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:47:18 np0005601977 systemd-machined[154431]: New machine qemu-39-instance-00000030.
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.929 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e4ccf1-e085-4c57-a6ae-2079cb3fa12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 NetworkManager[55565]: <info>  [1769766438.9384] device (tap029118c4-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:47:18 np0005601977 NetworkManager[55565]: <info>  [1769766438.9391] device (tap029118c4-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.942 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e8a584-4798-4312-b741-e34200b8bc2b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 systemd[1]: Started Virtual Machine qemu-39-instance-00000030.
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.964 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[34285922-b6a9-4c1b-9b74-a8d2a15783c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 NetworkManager[55565]: <info>  [1769766438.9697] manager: (tapb1c30d88-20): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.969 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdf3fe5-faec-4921-b793-87df14d3a985]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.990 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[39e4c438-f57e-4378-adc9-734c7ee68676]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:18 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:18.994 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[97b3905c-7f86-4938-a01b-7fa986870b7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 NetworkManager[55565]: <info>  [1769766439.0126] device (tapb1c30d88-20): carrier: link connected
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.016 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[79c899cf-f7cc-4e68-84b0-25d40973e07f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.032 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8da190cb-4a7b-45a7-947c-a13aaf19181b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1c30d88-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:ca:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497035, 'reachable_time': 39945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227441, 'error': None, 'target': 'ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.044 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6276eec6-92e9-4562-9789-08240b739511]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:ca2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497035, 'tstamp': 497035}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227442, 'error': None, 'target': 'ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.065 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5fdb6a-c3bd-47f2-abf6-47f451e49aac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1c30d88-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:ca:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497035, 'reachable_time': 39945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227443, 'error': None, 'target': 'ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.094 183134 DEBUG nova.compute.manager [req-c6c85147-46e0-42e0-b124-842c2ff95df6 req-2bddc3b0-981a-43f4-9e73-17b14bec3643 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.095 183134 DEBUG oslo_concurrency.lockutils [req-c6c85147-46e0-42e0-b124-842c2ff95df6 req-2bddc3b0-981a-43f4-9e73-17b14bec3643 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.096 183134 DEBUG oslo_concurrency.lockutils [req-c6c85147-46e0-42e0-b124-842c2ff95df6 req-2bddc3b0-981a-43f4-9e73-17b14bec3643 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.096 183134 DEBUG oslo_concurrency.lockutils [req-c6c85147-46e0-42e0-b124-842c2ff95df6 req-2bddc3b0-981a-43f4-9e73-17b14bec3643 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.097 183134 DEBUG nova.compute.manager [req-c6c85147-46e0-42e0-b124-842c2ff95df6 req-2bddc3b0-981a-43f4-9e73-17b14bec3643 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Processing event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.094 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[47e2bbbf-36ed-4ee8-9137-67c95444f13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.144 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[5f492bf5-d50b-43e8-bffe-63490c835375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.146 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1c30d88-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.146 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.147 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1c30d88-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.150 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:19 np0005601977 kernel: tapb1c30d88-20: entered promiscuous mode
Jan 30 04:47:19 np0005601977 NetworkManager[55565]: <info>  [1769766439.1515] manager: (tapb1c30d88-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.160 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1c30d88-20, col_values=(('external_ids', {'iface-id': '82892d82-7c51-4155-a9fb-5853c775bb76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.162 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:19Z|00486|binding|INFO|Releasing lport 82892d82-7c51-4155-a9fb-5853c775bb76 from this chassis (sb_readonly=0)
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.168 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.170 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.172 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[389b7be9-dbfc-4a17-ad7c-d0f8a7706639]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.173 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6.pid.haproxy
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:47:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:19.176 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'env', 'PROCESS_TAG=haproxy-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.223 183134 DEBUG nova.network.neutron [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updated VIF entry in instance network info cache for port 029118c4-1eaa-455b-b4e6-533ad78399bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.224 183134 DEBUG nova.network.neutron [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updating instance_info_cache with network_info: [{"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.242 183134 DEBUG oslo_concurrency.lockutils [req-a30fac28-01e1-4200-b736-91e745fe6c2c req-4cb684bc-ef1c-40a0-b648-3faeab68bffe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:47:19 np0005601977 podman[227475]: 2026-01-30 09:47:19.514137301 +0000 UTC m=+0.039258418 container create 4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:47:19 np0005601977 systemd[1]: Started libpod-conmon-4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51.scope.
Jan 30 04:47:19 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:47:19 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04733f326d411a168685c5f95941231ae4ea541e16fc88435e8ec5255fe5ac1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:47:19 np0005601977 podman[227475]: 2026-01-30 09:47:19.494465416 +0000 UTC m=+0.019586513 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:47:19 np0005601977 podman[227475]: 2026-01-30 09:47:19.592006766 +0000 UTC m=+0.117127893 container init 4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 30 04:47:19 np0005601977 podman[227475]: 2026-01-30 09:47:19.596818074 +0000 UTC m=+0.121939171 container start 4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 30 04:47:19 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [NOTICE]   (227495) : New worker (227497) forked
Jan 30 04:47:19 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [NOTICE]   (227495) : Loading success.
Jan 30 04:47:19 np0005601977 nova_compute[183130]: 2026-01-30 09:47:19.886 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.401 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766440.4013112, 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.402 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] VM Started (Lifecycle Event)#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.404 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.407 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.410 183134 INFO nova.virt.libvirt.driver [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Instance spawned successfully.#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.410 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.431 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.435 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.438 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.438 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.438 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.439 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.439 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.440 183134 DEBUG nova.virt.libvirt.driver [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.504 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.504 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766440.4022484, 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.504 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.544 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.548 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766440.406872, 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.549 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.556 183134 INFO nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Took 6.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.556 183134 DEBUG nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.579 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.582 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.614 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.636 183134 INFO nova.compute.manager [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Took 6.64 seconds to build instance.#033[00m
Jan 30 04:47:20 np0005601977 nova_compute[183130]: 2026-01-30 09:47:20.662 183134 DEBUG oslo_concurrency.lockutils [None req-c3e79252-433e-4253-9d29-58fd1a450f30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.168 183134 DEBUG nova.compute.manager [req-4649e054-aee1-4316-9823-dd5505e9e0aa req-21aff195-c86f-47e4-9d12-8924f4e1419a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.168 183134 DEBUG oslo_concurrency.lockutils [req-4649e054-aee1-4316-9823-dd5505e9e0aa req-21aff195-c86f-47e4-9d12-8924f4e1419a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.169 183134 DEBUG oslo_concurrency.lockutils [req-4649e054-aee1-4316-9823-dd5505e9e0aa req-21aff195-c86f-47e4-9d12-8924f4e1419a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.169 183134 DEBUG oslo_concurrency.lockutils [req-4649e054-aee1-4316-9823-dd5505e9e0aa req-21aff195-c86f-47e4-9d12-8924f4e1419a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.169 183134 DEBUG nova.compute.manager [req-4649e054-aee1-4316-9823-dd5505e9e0aa req-21aff195-c86f-47e4-9d12-8924f4e1419a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] No waiting events found dispatching network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.170 183134 WARNING nova.compute.manager [req-4649e054-aee1-4316-9823-dd5505e9e0aa req-21aff195-c86f-47e4-9d12-8924f4e1419a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received unexpected event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf for instance with vm_state active and task_state None.#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:21 np0005601977 nova_compute[183130]: 2026-01-30 09:47:21.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:22 np0005601977 nova_compute[183130]: 2026-01-30 09:47:22.358 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:22 np0005601977 nova_compute[183130]: 2026-01-30 09:47:22.359 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:47:22 np0005601977 nova_compute[183130]: 2026-01-30 09:47:22.386 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:47:22 np0005601977 nova_compute[183130]: 2026-01-30 09:47:22.387 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:22 np0005601977 nova_compute[183130]: 2026-01-30 09:47:22.388 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:22 np0005601977 nova_compute[183130]: 2026-01-30 09:47:22.677 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:23 np0005601977 nova_compute[183130]: 2026-01-30 09:47:23.273 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:23 np0005601977 nova_compute[183130]: 2026-01-30 09:47:23.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601977 nova_compute[183130]: 2026-01-30 09:47:24.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601977 nova_compute[183130]: 2026-01-30 09:47:24.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:47:24 np0005601977 podman[227513]: 2026-01-30 09:47:24.878340482 +0000 UTC m=+0.095718318 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:47:27 np0005601977 nova_compute[183130]: 2026-01-30 09:47:27.377 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:27 np0005601977 nova_compute[183130]: 2026-01-30 09:47:27.679 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:28 np0005601977 nova_compute[183130]: 2026-01-30 09:47:28.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:28 np0005601977 nova_compute[183130]: 2026-01-30 09:47:28.650 183134 DEBUG nova.compute.manager [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-changed-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:28 np0005601977 nova_compute[183130]: 2026-01-30 09:47:28.652 183134 DEBUG nova.compute.manager [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Refreshing instance network info cache due to event network-changed-029118c4-1eaa-455b-b4e6-533ad78399bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:47:28 np0005601977 nova_compute[183130]: 2026-01-30 09:47:28.653 183134 DEBUG oslo_concurrency.lockutils [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:47:28 np0005601977 nova_compute[183130]: 2026-01-30 09:47:28.653 183134 DEBUG oslo_concurrency.lockutils [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:47:28 np0005601977 nova_compute[183130]: 2026-01-30 09:47:28.654 183134 DEBUG nova.network.neutron [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Refreshing network info cache for port 029118c4-1eaa-455b-b4e6-533ad78399bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:47:30 np0005601977 nova_compute[183130]: 2026-01-30 09:47:30.431 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:30 np0005601977 nova_compute[183130]: 2026-01-30 09:47:30.651 183134 DEBUG nova.network.neutron [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updated VIF entry in instance network info cache for port 029118c4-1eaa-455b-b4e6-533ad78399bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:47:30 np0005601977 nova_compute[183130]: 2026-01-30 09:47:30.652 183134 DEBUG nova.network.neutron [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updating instance_info_cache with network_info: [{"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:47:30 np0005601977 nova_compute[183130]: 2026-01-30 09:47:30.714 183134 DEBUG oslo_concurrency.lockutils [req-5c179e05-9213-4258-81ce-baf9b359be2c req-5d35fc6e-98ac-4d8b-89e8-51013eab4514 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:47:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:31Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:b2:18 10.100.0.12
Jan 30 04:47:31 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:31Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:b2:18 10.100.0.12
Jan 30 04:47:32 np0005601977 nova_compute[183130]: 2026-01-30 09:47:32.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:32 np0005601977 nova_compute[183130]: 2026-01-30 09:47:32.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:47:32 np0005601977 nova_compute[183130]: 2026-01-30 09:47:32.365 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:47:32 np0005601977 nova_compute[183130]: 2026-01-30 09:47:32.727 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:33 np0005601977 nova_compute[183130]: 2026-01-30 09:47:33.277 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:33 np0005601977 podman[227558]: 2026-01-30 09:47:33.840206277 +0000 UTC m=+0.059423166 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.390 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.411 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Triggering sync for uuid 468f5a89-b848-45a6-8649-d09040ab2a09 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.411 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Triggering sync for uuid 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.411 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.412 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "468f5a89-b848-45a6-8649-d09040ab2a09" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.412 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.412 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.445 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "468f5a89-b848-45a6-8649-d09040ab2a09" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:34 np0005601977 nova_compute[183130]: 2026-01-30 09:47:34.448 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:37 np0005601977 nova_compute[183130]: 2026-01-30 09:47:37.728 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:38 np0005601977 nova_compute[183130]: 2026-01-30 09:47:38.279 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:39 np0005601977 nova_compute[183130]: 2026-01-30 09:47:39.112 183134 INFO nova.compute.manager [None req-b87b45bc-c1a2-4255-9183-9d5587e1de30 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Get console output#033[00m
Jan 30 04:47:39 np0005601977 nova_compute[183130]: 2026-01-30 09:47:39.118 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:47:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:40Z|00487|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:47:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:40Z|00488|binding|INFO|Releasing lport 82892d82-7c51-4155-a9fb-5853c775bb76 from this chassis (sb_readonly=0)
Jan 30 04:47:40 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:40Z|00489|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:47:40 np0005601977 nova_compute[183130]: 2026-01-30 09:47:40.727 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:41 np0005601977 nova_compute[183130]: 2026-01-30 09:47:41.925 183134 INFO nova.compute.manager [None req-d8b9b936-b2dd-4c98-9d9c-e13ba53d0a50 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Get console output#033[00m
Jan 30 04:47:41 np0005601977 nova_compute[183130]: 2026-01-30 09:47:41.929 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:47:42 np0005601977 nova_compute[183130]: 2026-01-30 09:47:42.776 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:42 np0005601977 nova_compute[183130]: 2026-01-30 09:47:42.812 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:43 np0005601977 nova_compute[183130]: 2026-01-30 09:47:43.086 183134 INFO nova.compute.manager [None req-ab69ef54-69f1-416e-b205-2b4462a7612f a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Get console output#033[00m
Jan 30 04:47:43 np0005601977 nova_compute[183130]: 2026-01-30 09:47:43.090 211768 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 30 04:47:43 np0005601977 nova_compute[183130]: 2026-01-30 09:47:43.281 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:43.488 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:47:43 np0005601977 nova_compute[183130]: 2026-01-30 09:47:43.489 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:43 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:43.490 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:47:43 np0005601977 podman[227584]: 2026-01-30 09:47:43.839225052 +0000 UTC m=+0.050859660 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:47:43 np0005601977 podman[227585]: 2026-01-30 09:47:43.845846262 +0000 UTC m=+0.050560512 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.125 183134 DEBUG nova.compute.manager [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-changed-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.125 183134 DEBUG nova.compute.manager [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Refreshing instance network info cache due to event network-changed-029118c4-1eaa-455b-b4e6-533ad78399bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.126 183134 DEBUG oslo_concurrency.lockutils [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.126 183134 DEBUG oslo_concurrency.lockutils [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.126 183134 DEBUG nova.network.neutron [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Refreshing network info cache for port 029118c4-1eaa-455b-b4e6-533ad78399bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.206 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.207 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.207 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.207 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.207 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.208 183134 INFO nova.compute.manager [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Terminating instance#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.209 183134 DEBUG nova.compute.manager [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:47:44 np0005601977 kernel: tap029118c4-1e (unregistering): left promiscuous mode
Jan 30 04:47:44 np0005601977 NetworkManager[55565]: <info>  [1769766464.2461] device (tap029118c4-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:47:44 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:44Z|00490|binding|INFO|Releasing lport 029118c4-1eaa-455b-b4e6-533ad78399bf from this chassis (sb_readonly=0)
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.250 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:44 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:44Z|00491|binding|INFO|Setting lport 029118c4-1eaa-455b-b4e6-533ad78399bf down in Southbound
Jan 30 04:47:44 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:44Z|00492|binding|INFO|Removing iface tap029118c4-1e ovn-installed in OVS
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.254 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.259 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.261 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:b2:18 10.100.0.12'], port_security=['fa:16:3e:ce:b2:18 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '61bbc8ff-22ea-4c6a-a9cb-7483e849e44d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33bba0bc2a744596b558c6598a1970de', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e8e00093-d7b1-4a9a-82d3-105f1c49d3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d05f7c8-4e37-46be-ac35-803db3090f50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=029118c4-1eaa-455b-b4e6-533ad78399bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.262 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 029118c4-1eaa-455b-b4e6-533ad78399bf in datapath b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 unbound from our chassis#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.264 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.266 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[07fb21b7-be39-4894-b4fa-fe2260da61b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.266 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 namespace which is not needed anymore#033[00m
Jan 30 04:47:44 np0005601977 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 30 04:47:44 np0005601977 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000030.scope: Consumed 13.203s CPU time.
Jan 30 04:47:44 np0005601977 systemd-machined[154431]: Machine qemu-39-instance-00000030 terminated.
Jan 30 04:47:44 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [NOTICE]   (227495) : haproxy version is 2.8.14-c23fe91
Jan 30 04:47:44 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [NOTICE]   (227495) : path to executable is /usr/sbin/haproxy
Jan 30 04:47:44 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [WARNING]  (227495) : Exiting Master process...
Jan 30 04:47:44 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [WARNING]  (227495) : Exiting Master process...
Jan 30 04:47:44 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [ALERT]    (227495) : Current worker (227497) exited with code 143 (Terminated)
Jan 30 04:47:44 np0005601977 neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6[227491]: [WARNING]  (227495) : All workers exited. Exiting... (0)
Jan 30 04:47:44 np0005601977 systemd[1]: libpod-4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51.scope: Deactivated successfully.
Jan 30 04:47:44 np0005601977 podman[227650]: 2026-01-30 09:47:44.375922317 +0000 UTC m=+0.037366334 container died 4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 30 04:47:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51-userdata-shm.mount: Deactivated successfully.
Jan 30 04:47:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay-04733f326d411a168685c5f95941231ae4ea541e16fc88435e8ec5255fe5ac1b-merged.mount: Deactivated successfully.
Jan 30 04:47:44 np0005601977 podman[227650]: 2026-01-30 09:47:44.432040787 +0000 UTC m=+0.093484784 container cleanup 4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 30 04:47:44 np0005601977 systemd[1]: libpod-conmon-4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51.scope: Deactivated successfully.
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.452 183134 INFO nova.virt.libvirt.driver [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Instance destroyed successfully.#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.452 183134 DEBUG nova.objects.instance [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'resources' on Instance uuid 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.457 183134 DEBUG nova.compute.manager [req-f753943c-5dda-43d4-8486-3b045e1b2ff3 req-18283056-e2ba-45df-8820-2930a8fc95ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-vif-unplugged-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.457 183134 DEBUG oslo_concurrency.lockutils [req-f753943c-5dda-43d4-8486-3b045e1b2ff3 req-18283056-e2ba-45df-8820-2930a8fc95ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.458 183134 DEBUG oslo_concurrency.lockutils [req-f753943c-5dda-43d4-8486-3b045e1b2ff3 req-18283056-e2ba-45df-8820-2930a8fc95ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.458 183134 DEBUG oslo_concurrency.lockutils [req-f753943c-5dda-43d4-8486-3b045e1b2ff3 req-18283056-e2ba-45df-8820-2930a8fc95ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.458 183134 DEBUG nova.compute.manager [req-f753943c-5dda-43d4-8486-3b045e1b2ff3 req-18283056-e2ba-45df-8820-2930a8fc95ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] No waiting events found dispatching network-vif-unplugged-029118c4-1eaa-455b-b4e6-533ad78399bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.458 183134 DEBUG nova.compute.manager [req-f753943c-5dda-43d4-8486-3b045e1b2ff3 req-18283056-e2ba-45df-8820-2930a8fc95ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-vif-unplugged-029118c4-1eaa-455b-b4e6-533ad78399bf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.473 183134 DEBUG nova.virt.libvirt.vif [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:47:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1470633199',display_name='tempest-TestNetworkBasicOps-server-1470633199',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1470633199',id=48,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDtLI36dTVl5yAjyfdZZELwDSZS9wXpCc6/82/IE7B7+EDztXf3ZURYQpFV26TZLdNnJDVIQMy00x+zHb2RtXAjDZ9E52Cat1/yiKqH/3YEKMozEBByujXO07DPs/YgxdA==',key_name='tempest-TestNetworkBasicOps-2080650735',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:47:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='33bba0bc2a744596b558c6598a1970de',ramdisk_id='',reservation_id='r-garak7p0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-345324916',owner_user_name='tempest-TestNetworkBasicOps-345324916-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:47:20Z,user_data=None,user_id='a6dc933f15eb45099f724c1eb5d822e4',uuid=61bbc8ff-22ea-4c6a-a9cb-7483e849e44d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.473 183134 DEBUG nova.network.os_vif_util [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converting VIF {"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.474 183134 DEBUG nova.network.os_vif_util [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.474 183134 DEBUG os_vif [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.475 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.476 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap029118c4-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.479 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.481 183134 INFO os_vif [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:b2:18,bridge_name='br-int',has_traffic_filtering=True,id=029118c4-1eaa-455b-b4e6-533ad78399bf,network=Network(b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap029118c4-1e')#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.481 183134 INFO nova.virt.libvirt.driver [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Deleting instance files /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d_del#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.482 183134 INFO nova.virt.libvirt.driver [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Deletion of /var/lib/nova/instances/61bbc8ff-22ea-4c6a-a9cb-7483e849e44d_del complete#033[00m
Jan 30 04:47:44 np0005601977 podman[227692]: 2026-01-30 09:47:44.485278215 +0000 UTC m=+0.038333751 container remove 4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.488 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[43f25f7f-3c7a-47d8-813b-b79210d011ce]: (4, ('Fri Jan 30 09:47:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 (4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51)\n4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51\nFri Jan 30 09:47:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 (4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51)\n4345c49bfc5db1da312bd6dfa9b55e82de789f1181a573da7a072b760c144a51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.490 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c089dc97-c851-467e-b2f4-708bf999d711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.491 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1c30d88-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:44 np0005601977 kernel: tapb1c30d88-20: left promiscuous mode
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.492 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.499 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.498 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[490b15b8-e51e-4589-a4a1-8aa00c53aa8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.521 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8960495e-ae91-4503-97b2-c63e02f0e21d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.522 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2478b5af-4569-4881-a8c6-3591d8a13033]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.531 183134 INFO nova.compute.manager [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.531 183134 DEBUG oslo.service.loopingcall [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.531 183134 DEBUG nova.compute.manager [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:47:44 np0005601977 nova_compute[183130]: 2026-01-30 09:47:44.532 183134 DEBUG nova.network.neutron [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.534 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[309cfae3-1276-440a-a945-7fb89e7473b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497030, 'reachable_time': 18634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227713, 'error': None, 'target': 'ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.536 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:47:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:44.536 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[d5535207-8977-4530-b862-27cf91fa617a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:47:44 np0005601977 systemd[1]: run-netns-ovnmeta\x2db1c30d88\x2d28c6\x2d4bf0\x2d8c8a\x2d517f6e57e2f6.mount: Deactivated successfully.
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.267 183134 DEBUG nova.network.neutron [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.285 183134 INFO nova.compute.manager [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Took 0.75 seconds to deallocate network for instance.#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.323 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.324 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.403 183134 DEBUG nova.compute.provider_tree [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.418 183134 DEBUG nova.scheduler.client.report [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.448 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.479 183134 INFO nova.scheduler.client.report [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.536 183134 DEBUG oslo_concurrency.lockutils [None req-b78c7fb1-f7f6-4cdc-8b34-10a83a5f4195 a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.775 183134 DEBUG nova.network.neutron [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updated VIF entry in instance network info cache for port 029118c4-1eaa-455b-b4e6-533ad78399bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.775 183134 DEBUG nova.network.neutron [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Updating instance_info_cache with network_info: [{"id": "029118c4-1eaa-455b-b4e6-533ad78399bf", "address": "fa:16:3e:ce:b2:18", "network": {"id": "b1c30d88-28c6-4bf0-8c8a-517f6e57e2f6", "bridge": "br-int", "label": "tempest-network-smoke--1388393092", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "33bba0bc2a744596b558c6598a1970de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap029118c4-1e", "ovs_interfaceid": "029118c4-1eaa-455b-b4e6-533ad78399bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:47:45 np0005601977 nova_compute[183130]: 2026-01-30 09:47:45.792 183134 DEBUG oslo_concurrency.lockutils [req-6af69fab-59b9-4e39-bf18-a1ce4f757b6e req-67a31896-6a42-4d8d-be5b-9c5acd92d9bf dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-61bbc8ff-22ea-4c6a-a9cb-7483e849e44d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.183 183134 DEBUG nova.compute.manager [req-bdba52bd-a9e1-4056-97f5-a638c6c5475d req-8770374f-1604-4876-baad-7eea5b9494e0 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-vif-deleted-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.565 183134 DEBUG nova.compute.manager [req-4df2d974-7699-4d35-9040-168206422824 req-9c5df705-9063-4382-bc53-d19188b14c3e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.565 183134 DEBUG oslo_concurrency.lockutils [req-4df2d974-7699-4d35-9040-168206422824 req-9c5df705-9063-4382-bc53-d19188b14c3e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.566 183134 DEBUG oslo_concurrency.lockutils [req-4df2d974-7699-4d35-9040-168206422824 req-9c5df705-9063-4382-bc53-d19188b14c3e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.566 183134 DEBUG oslo_concurrency.lockutils [req-4df2d974-7699-4d35-9040-168206422824 req-9c5df705-9063-4382-bc53-d19188b14c3e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "61bbc8ff-22ea-4c6a-a9cb-7483e849e44d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.566 183134 DEBUG nova.compute.manager [req-4df2d974-7699-4d35-9040-168206422824 req-9c5df705-9063-4382-bc53-d19188b14c3e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] No waiting events found dispatching network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:47:46 np0005601977 nova_compute[183130]: 2026-01-30 09:47:46.566 183134 WARNING nova.compute.manager [req-4df2d974-7699-4d35-9040-168206422824 req-9c5df705-9063-4382-bc53-d19188b14c3e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Received unexpected event network-vif-plugged-029118c4-1eaa-455b-b4e6-533ad78399bf for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:47:47 np0005601977 nova_compute[183130]: 2026-01-30 09:47:47.819 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:48Z|00493|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:47:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:47:48Z|00494|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:47:48 np0005601977 nova_compute[183130]: 2026-01-30 09:47:48.919 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:49 np0005601977 nova_compute[183130]: 2026-01-30 09:47:49.477 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:49 np0005601977 podman[227716]: 2026-01-30 09:47:49.860956437 +0000 UTC m=+0.066471519 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:47:49 np0005601977 podman[227715]: 2026-01-30 09:47:49.867306229 +0000 UTC m=+0.073740108 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:47:52 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:52.491 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:52 np0005601977 nova_compute[183130]: 2026-01-30 09:47:52.820 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:54 np0005601977 nova_compute[183130]: 2026-01-30 09:47:54.479 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.454 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'name': 'tempest-TestGettingAddress-server-320003657', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'hostId': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.456 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.471 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/cpu volume: 11510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a086350e-5d18-40f3-a16b-9d4323f9d588', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11510000000, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'timestamp': '2026-01-30T09:47:55.456218', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c0905d14-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.86605548, 'message_signature': 'ecf11c71930a11347e19142ca6f0e4f9cec730e3a54bebe2458f4194e3e6c549'}]}, 'timestamp': '2026-01-30 09:47:55.472237', '_unique_id': 'f30ca8c8f1e74ae3b052e3b87130bd0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.477 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes volume: 4279 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.477 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes volume: 1710 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88d23f19-6f67-4426-bf55-283fa8e67433', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4279, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.474601', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c091331a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '89beb3d4c035d7934f27def1d7cbff0a7bce3957ab6b3eb559a86acfb918e786'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1710, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.474601', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c09141ca-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '592296ace23b97747d14ff6cb35a7da2ddbd327e726f727d3473f7ee75afda81'}]}, 'timestamp': '2026-01-30 09:47:55.478013', '_unique_id': '8a68ce949f1c40d6b739027c5f27b9da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.479 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/memory.usage volume: 43.66015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '935be20e-ba53-41cb-ade1-0246f1139aeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.66015625, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'timestamp': '2026-01-30T09:47:55.479953', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c09199b8-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.86605548, 'message_signature': '6e9d36d05c1ab32812b621b834a2b43d5706510dc8dfecc9cbd3e822dd637a89'}]}, 'timestamp': '2026-01-30 09:47:55.480275', '_unique_id': '060e2fe7236b4392a257b60206cc1804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.480 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.481 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.481 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.490 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.490 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23f7c26a-0cd4-4748-b1e5-69404d13bd3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.481865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c0933b74-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.876780928, 'message_signature': '3e79331163bbb8ed10249167bce36d9be1a90102e6aa731f56414dbd08e2fe03'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.481865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c09346aa-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.876780928, 'message_signature': 'f0a914519b00c78fa92c32b743ab8cf856a7c8384b18237bea0224739218b7af'}]}, 'timestamp': '2026-01-30 09:47:55.491241', '_unique_id': 'a07ba4a4deae486ca37e8765c8ec04ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.492 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.493 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.493 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes.delta volume: 684 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86c127ec-2422-4532-a0f1-18647f58f671', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.493069', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c09399b6-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '5aa0b07cd531df1e6a46fa295464d89ee452ebd5768a65e4f42a9297befc03cb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 684, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.493069', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c093a1f4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '3c84cff16bbfd7cb351497c5ffe4a4b5287a4f31a2f02eab8b715d239fa11452'}]}, 'timestamp': '2026-01-30 09:47:55.493529', '_unique_id': '81d24452ba3b4c15ab0d9afa09b2eacc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.494 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.495 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c81162d2-347f-4cf0-a13c-0600dbdb4ed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.494714', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c093e128-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '7907b1a5d439da3fe82a1438ec170a742a6ce802f560442bfce7df007882f586'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 19, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.494714', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c093ebdc-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'f8c9f82043ca1067dcbdd18a3b8260a465e6341b336d5acec8bef0247e8701da'}]}, 'timestamp': '2026-01-30 09:47:55.495463', '_unique_id': '8e0672adfcc74a0488764ae1a6a80995'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.496 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09b37858-a2d2-4119-a1bc-b42fe5ff4f69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.496687', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c0942688-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'aa60f5daa32764bc406b7db00a3fcd3ec503c9bf20ffd2303e0a36dd79fc4fa9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.496687', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c0943010-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'bfb788ffe776fd0c75fa538bb2efcc2e453caa43eed2e3e6964fbbdbf7bd9cb3'}]}, 'timestamp': '2026-01-30 09:47:55.497186', '_unique_id': 'db256e9f41154888b956130f53e1c4a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.498 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.498 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.498 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '083efece-2581-4b80-9fdf-6d915537eda2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.498384', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c0946a8a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '068dffdd295063cc6e5145337f6a68ef02003fd87ecd06719d8039e6f3577750'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.498384', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c09473cc-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'c49683ee0b0bdc5c45560fa5f8b734b09f15d6a2ec292c23be1ee9d0425d1830'}]}, 'timestamp': '2026-01-30 09:47:55.498913', '_unique_id': '0158d2105f2d43af988ec27083541dc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.500 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.500 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6035ba9e-1f6e-4ad5-b991-df8447314d24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.500339', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c094b4f4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '393f3328d603eba924f269ce86c2cea43992a265bf9888572e656c2435f922cd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.500339', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c094bce2-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '8d0e5cd54f19e836f47f8485426f9b8aae9e18c3510892bc5af7edb9d2a13de9'}]}, 'timestamp': '2026-01-30 09:47:55.500795', '_unique_id': '5f4393e038e746ed8298ab9027085596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.501 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.522 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.latency volume: 489043333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.522 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.latency volume: 43086902 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32b18af8-0c46-4eab-8afc-af334bad2578', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 489043333, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.501857', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c09817e8-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': 'a7c04f5a85c338068c168eaedba556c8e9194fbf72d2a3a2bcf19f2b938a91ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43086902, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.501857', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c098256c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': '040c4972b27ce3756f95c76d28780ce826ec66c17bbe2a925dd67c4c494e9992'}]}, 'timestamp': '2026-01-30 09:47:55.523158', '_unique_id': '6f3f8807e0b64ca29558fbc78ef0d7f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.524 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.525 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18dc044d-5f79-4981-9cbe-4d3c98190bb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.524778', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c09870e4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.876780928, 'message_signature': '89effae51a1510733deacacbc70d9520fd4bebad4a186fc25bcb821fdb2ac854'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.524778', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c0987cec-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.876780928, 'message_signature': '3303e8daad8c8b8b41aa7772452a44416221917eb2df53ed2fcb7db36a30cef3'}]}, 'timestamp': '2026-01-30 09:47:55.525393', '_unique_id': 'd9cf350fe3314d5b9abc99567d8e7b2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.526 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes.delta volume: 84 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.527 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes.delta volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97467566-ef64-4f81-bdff-bf55fa4df8b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 84, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.526898', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c098c3f0-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'accf5f884a7ada6231caf20691eda297197986f1231a8a7051e250d0edd2bb15'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.526898', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c098d084-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'ee79cd65f99d31396cc046ff567a378dbe425070b45d37187bf1459b75a4f66b'}]}, 'timestamp': '2026-01-30 09:47:55.527542', '_unique_id': '0e74700209eb4702b416c17b5dee0fa0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.529 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.requests volume: 1063 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.529 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3292ed4a-64c4-42c9-ab8e-b92f99eb645a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1063, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.529130', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c0991bc0-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': '55fcf942f8abeadc3b1ad008ffe35f1ab4f7dac40849ac046915e034751b56c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.529130', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c099270a-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': '61da6a9fa9cfb67c77983462483e6ee674d30b9933a5f9f500a1ebcc1c2a8d99'}]}, 'timestamp': '2026-01-30 09:47:55.529749', '_unique_id': 'fc6459de79654bb1bfdd9d77dae316b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.531 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.bytes volume: 29415936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.532 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '614befc1-cf59-4dd2-b339-a3aa80b63016', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29415936, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.531911', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c09987f4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': '04e33b46f73729215c2f681bc26b3537a0c4c6498b15c71cfcbbef80af4e6475'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.531911', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c0999460-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': 'ca7b0e0c329d8495030ab66d6c63a4ba8828fff68cf7aaf5949c43bc08667517'}]}, 'timestamp': '2026-01-30 09:47:55.532547', '_unique_id': '0024c20437254e2ab7bd895bcd8b3b5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.534 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.requests volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.534 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b37701d-887a-47f4-a205-1cf8468497fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 336, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.534102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c099ddbc-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': 'b566d412271b8a773f6731b3cfc1dd7cfaf1333ce8c5c380e6a4e7caf9e3b168'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.534102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c099e8ca-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': 'f7821471d9297b5d5a6a85871cd33950d9518cec214ece277955e53d7238f458'}]}, 'timestamp': '2026-01-30 09:47:55.534704', '_unique_id': '582f33362dd240cdba1ae1c8b419b115'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.536 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.536 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes volume: 3392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7a6a0e1-4c3a-4491-bfe7-d886b4adebf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.536141', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c09a2e16-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'ff1c8dbba9fe3814e811e5d834bda7a156a43b5957685a6463219d8ec3bc57c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3392, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.536141', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c09a399c-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'e08471b0fd7f60be7ddeea38c420dba89f156f3aee2094dfa351faa5eba4472c'}]}, 'timestamp': '2026-01-30 09:47:55.536783', '_unique_id': '92dc47e29be640df82b89b45d3ffbd0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.538 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.538 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe3cb38d-b9d7-439c-b981-42a97d0319e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.538264', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c09a7fc4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.876780928, 'message_signature': '860efa4ea82db8449e2dfa60c90546418de22b1741de02722017bc7492cb2711'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.538264', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c09a8ac8-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.876780928, 'message_signature': 'c0b5cee69c1f90ad3fec0a4207ed536a6d81e512752ba75d2440c71565a94d03'}]}, 'timestamp': '2026-01-30 09:47:55.538850', '_unique_id': '69cc822e0b694b8aa516159c3c2fd7aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.540 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.540 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1260e1b4-8fe9-487f-8016-74759e0586cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.540283', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c09aceca-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '79ee281edfc2e230f3c0928b757847b72132bdde1f0a293c056971f0dd9d5c35'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.540283', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c09ada50-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'e73e10fe408d16d5cb69d33575130b0cd57184bf382d7d6d83fc0256c407ba32'}]}, 'timestamp': '2026-01-30 09:47:55.540898', '_unique_id': '6d7cfd5c7ed04d2ab7e6c75378dacf62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.542 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.bytes volume: 73089024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.542 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f5912fe-40cc-4123-8171-23f2a0c4dff4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73089024, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.542384', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c09b20a0-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': '9e709c8ae16857109c8397a9fc58bbecc402c1dce914e2644e1da1f85ce8a1b6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.542384', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c09b2cf8-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': '55459808335b1aee81cd132d1dd8f0570af257243133c3d3567ef5b03d6ee575'}]}, 'timestamp': '2026-01-30 09:47:55.543002', '_unique_id': '663d9c91c130488592b57df3bbaecbac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.544 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.latency volume: 1625763224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.544 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49f726e8-70b7-41ff-b8af-c9d60365e0d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1625763224, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:47:55.544445', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c09b7104-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': 'cf8693fcf7a99108aa46f2b36a7e0249745e2c92016a0473f295eabb1f8d2f5c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:47:55.544445', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c09b7bf4-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.896749071, 'message_signature': 'f9bf166482c9662cadf60b04ce6b1f89f3f404267988d27e1b64a44b5a1b0691'}]}, 'timestamp': '2026-01-30 09:47:55.545024', '_unique_id': '6a056e7922674861aedcde022b71befd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.546 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.546 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.546 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5a26c97-17f7-4eb8-a507-6d260a5b5ed4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:47:55.546477', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': 'c09bc096-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': '9b74f2151c78710d7e77a9d51d616990e71c3f09f24f6aef74c27e04e1caa785'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:47:55.546477', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': 'c09bcc26-fdc0-11f0-a471-fa163eabe782', 'monotonic_time': 5006.869511849, 'message_signature': 'c3cf204d1dd348d4be9f417a4a2640e8143f50815c6d3dfe4991639cf36b73dc'}]}, 'timestamp': '2026-01-30 09:47:55.547088', '_unique_id': '5eba530606ab4cdf945c2c7f18491355'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:47:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:47:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:47:55 np0005601977 podman[227756]: 2026-01-30 09:47:55.853857533 +0000 UTC m=+0.073232053 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:47:56 np0005601977 nova_compute[183130]: 2026-01-30 09:47:56.835 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:57.400 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:57.401 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:47:57.401 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:57 np0005601977 nova_compute[183130]: 2026-01-30 09:47:57.823 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:59 np0005601977 nova_compute[183130]: 2026-01-30 09:47:59.451 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766464.4506893, 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:47:59 np0005601977 nova_compute[183130]: 2026-01-30 09:47:59.452 183134 INFO nova.compute.manager [-] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:47:59 np0005601977 nova_compute[183130]: 2026-01-30 09:47:59.475 183134 DEBUG nova.compute.manager [None req-558db90c-36e6-4741-a974-e19ecac8c02a - - - - - -] [instance: 61bbc8ff-22ea-4c6a-a9cb-7483e849e44d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:47:59 np0005601977 nova_compute[183130]: 2026-01-30 09:47:59.481 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:47:59 np0005601977 nova_compute[183130]: 2026-01-30 09:47:59.912 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:02 np0005601977 nova_compute[183130]: 2026-01-30 09:48:02.825 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:04 np0005601977 nova_compute[183130]: 2026-01-30 09:48:04.484 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:04 np0005601977 podman[227784]: 2026-01-30 09:48:04.849801399 +0000 UTC m=+0.056368359 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:48:04 np0005601977 nova_compute[183130]: 2026-01-30 09:48:04.901 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:07 np0005601977 nova_compute[183130]: 2026-01-30 09:48:07.828 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:08 np0005601977 nova_compute[183130]: 2026-01-30 09:48:08.709 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:09 np0005601977 nova_compute[183130]: 2026-01-30 09:48:09.486 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.375 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.376 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.377 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.455 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.506 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.507 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.552 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.691 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.692 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5459MB free_disk=73.21946334838867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.692 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.692 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.869 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.891 183134 INFO nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 3c0278c5-e992-4ef6-bf28-99033c90ee64 has allocations against this compute host but is not found in the database.#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.891 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.892 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:48:11 np0005601977 nova_compute[183130]: 2026-01-30 09:48:11.991 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing inventories for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.033 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.034 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.051 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.075 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating ProviderTree inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.075 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.091 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing aggregate associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.111 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing trait associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, traits: HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.138 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.159 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.172 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.191 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.192 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.192 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.201 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.201 183134 INFO nova.compute.claims [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.324 183134 DEBUG nova.compute.provider_tree [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.342 183134 DEBUG nova.scheduler.client.report [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.363 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.364 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.417 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.418 183134 DEBUG nova.network.neutron [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.473 183134 INFO nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.536 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.624 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.627 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.627 183134 INFO nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Creating image(s)#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.628 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "/var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.629 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "/var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.630 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "/var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.665 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.749 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.750 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.751 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.777 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.832 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.839 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.840 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.866 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.867 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.868 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.937 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.938 183134 DEBUG nova.virt.disk.api [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Checking if we can resize image /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.938 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.991 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.992 183134 DEBUG nova.virt.disk.api [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Cannot resize image /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:48:12 np0005601977 nova_compute[183130]: 2026-01-30 09:48:12.993 183134 DEBUG nova.objects.instance [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lazy-loading 'migration_context' on Instance uuid 3c0278c5-e992-4ef6-bf28-99033c90ee64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:48:13 np0005601977 nova_compute[183130]: 2026-01-30 09:48:13.007 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:48:13 np0005601977 nova_compute[183130]: 2026-01-30 09:48:13.008 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Ensure instance console log exists: /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:48:13 np0005601977 nova_compute[183130]: 2026-01-30 09:48:13.008 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:13 np0005601977 nova_compute[183130]: 2026-01-30 09:48:13.008 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:13 np0005601977 nova_compute[183130]: 2026-01-30 09:48:13.009 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:14 np0005601977 nova_compute[183130]: 2026-01-30 09:48:14.488 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:14 np0005601977 podman[227827]: 2026-01-30 09:48:14.852041416 +0000 UTC m=+0.067213680 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, architecture=x86_64, release=1769056855, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git)
Jan 30 04:48:14 np0005601977 podman[227828]: 2026-01-30 09:48:14.85738516 +0000 UTC m=+0.068692383 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:48:15 np0005601977 nova_compute[183130]: 2026-01-30 09:48:15.704 183134 DEBUG nova.network.neutron [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Successfully created port: f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.204 183134 DEBUG nova.network.neutron [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Successfully updated port: f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.230 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "refresh_cache-3c0278c5-e992-4ef6-bf28-99033c90ee64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.231 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquired lock "refresh_cache-3c0278c5-e992-4ef6-bf28-99033c90ee64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.231 183134 DEBUG nova.network.neutron [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.308 183134 DEBUG nova.compute.manager [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-changed-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.309 183134 DEBUG nova.compute.manager [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Refreshing instance network info cache due to event network-changed-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.310 183134 DEBUG oslo_concurrency.lockutils [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-3c0278c5-e992-4ef6-bf28-99033c90ee64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.416 183134 DEBUG nova.network.neutron [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:48:17 np0005601977 nova_compute[183130]: 2026-01-30 09:48:17.873 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.761 183134 DEBUG nova.network.neutron [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Updating instance_info_cache with network_info: [{"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.782 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Releasing lock "refresh_cache-3c0278c5-e992-4ef6-bf28-99033c90ee64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.783 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Instance network_info: |[{"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.783 183134 DEBUG oslo_concurrency.lockutils [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-3c0278c5-e992-4ef6-bf28-99033c90ee64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.784 183134 DEBUG nova.network.neutron [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Refreshing network info cache for port f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.788 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Start _get_guest_xml network_info=[{"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.794 183134 WARNING nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.800 183134 DEBUG nova.virt.libvirt.host [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.801 183134 DEBUG nova.virt.libvirt.host [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.809 183134 DEBUG nova.virt.libvirt.host [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.809 183134 DEBUG nova.virt.libvirt.host [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.811 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.812 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.813 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.813 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.813 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.814 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.814 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.815 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.815 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.816 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.816 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.817 183134 DEBUG nova.virt.hardware [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.822 183134 DEBUG nova.virt.libvirt.vif [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:48:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-636368625',display_name='tempest-TestServerMultinode-server-636368625',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-636368625',id=49,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d1f6575f78b14c7d87ed252e603d0a6d',ramdisk_id='',reservation_id='r-an9lk7m2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1725111207',owner_user_name='tempest-TestServerMultinode-1725
111207-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:48:12Z,user_data=None,user_id='bc74e93bd60c4618af5bc22cb2c9535e',uuid=3c0278c5-e992-4ef6-bf28-99033c90ee64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.823 183134 DEBUG nova.network.os_vif_util [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Converting VIF {"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.824 183134 DEBUG nova.network.os_vif_util [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.825 183134 DEBUG nova.objects.instance [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lazy-loading 'pci_devices' on Instance uuid 3c0278c5-e992-4ef6-bf28-99033c90ee64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.850 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <uuid>3c0278c5-e992-4ef6-bf28-99033c90ee64</uuid>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <name>instance-00000031</name>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestServerMultinode-server-636368625</nova:name>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:48:18</nova:creationTime>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:user uuid="bc74e93bd60c4618af5bc22cb2c9535e">tempest-TestServerMultinode-1725111207-project-admin</nova:user>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:project uuid="d1f6575f78b14c7d87ed252e603d0a6d">tempest-TestServerMultinode-1725111207</nova:project>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        <nova:port uuid="f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <entry name="serial">3c0278c5-e992-4ef6-bf28-99033c90ee64</entry>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <entry name="uuid">3c0278c5-e992-4ef6-bf28-99033c90ee64</entry>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.config"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:51:f4:73"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <target dev="tapf52a88e4-1c"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/console.log" append="off"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:48:18 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:48:18 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:48:18 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:48:18 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.851 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Preparing to wait for external event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.852 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.852 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.853 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.854 183134 DEBUG nova.virt.libvirt.vif [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:48:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-636368625',display_name='tempest-TestServerMultinode-server-636368625',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-636368625',id=49,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d1f6575f78b14c7d87ed252e603d0a6d',ramdisk_id='',reservation_id='r-an9lk7m2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1725111207',owner_user_name='tempest-TestServerMultinode-1725111207-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:48:12Z,user_data=None,user_id='bc74e93bd60c4618af5bc22cb2c9535e',uuid=3c0278c5-e992-4ef6-bf28-99033c90ee64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.854 183134 DEBUG nova.network.os_vif_util [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Converting VIF {"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.855 183134 DEBUG nova.network.os_vif_util [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.856 183134 DEBUG os_vif [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.857 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.858 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.858 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.862 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.863 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf52a88e4-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.864 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf52a88e4-1c, col_values=(('external_ids', {'iface-id': 'f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:f4:73', 'vm-uuid': '3c0278c5-e992-4ef6-bf28-99033c90ee64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.866 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:18 np0005601977 NetworkManager[55565]: <info>  [1769766498.8683] manager: (tapf52a88e4-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.868 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.875 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.877 183134 INFO os_vif [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c')#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.927 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.928 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.928 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] No VIF found with MAC fa:16:3e:51:f4:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:48:18 np0005601977 nova_compute[183130]: 2026-01-30 09:48:18.928 183134 INFO nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Using config drive#033[00m
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.642 183134 INFO nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Creating config drive at /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.config#033[00m
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.646 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprz700zrc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.763 183134 DEBUG oslo_concurrency.processutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprz700zrc" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:19 np0005601977 kernel: tapf52a88e4-1c: entered promiscuous mode
Jan 30 04:48:19 np0005601977 NetworkManager[55565]: <info>  [1769766499.8230] manager: (tapf52a88e4-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 30 04:48:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:19Z|00495|binding|INFO|Claiming lport f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e for this chassis.
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.822 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:19Z|00496|binding|INFO|f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e: Claiming fa:16:3e:51:f4:73 10.100.0.13
Jan 30 04:48:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:19Z|00497|binding|INFO|Setting lport f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e up in Southbound
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.830 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:f4:73 10.100.0.13'], port_security=['fa:16:3e:51:f4:73 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3c0278c5-e992-4ef6-bf28-99033c90ee64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd1f6575f78b14c7d87ed252e603d0a6d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '990ab1f8-b356-4643-ab81-e26fc7994d7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccc9cad1-b1b5-42fa-b4d7-01d8d6c8aad6, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.831 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:19Z|00498|binding|INFO|Setting lport f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e ovn-installed in OVS
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.832 104706 INFO neutron.agent.ovn.metadata.agent [-] Port f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e in datapath 098dfb1c-7f08-4bd3-9713-11f8f2a9351c bound to our chassis#033[00m
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.832 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.833 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 098dfb1c-7f08-4bd3-9713-11f8f2a9351c#033[00m
Jan 30 04:48:19 np0005601977 nova_compute[183130]: 2026-01-30 09:48:19.837 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.844 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1666a259-d1e8-4e49-9cc4-052472a45795]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.846 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap098dfb1c-71 in ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.848 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap098dfb1c-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.848 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[df6c3ba5-b3fb-42cb-93d0-0cdbc4200e89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.849 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7b869d21-9173-40fd-8bd1-ab754f02cb4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 systemd-udevd[227888]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:48:19 np0005601977 systemd-machined[154431]: New machine qemu-40-instance-00000031.
Jan 30 04:48:19 np0005601977 NetworkManager[55565]: <info>  [1769766499.8633] device (tapf52a88e4-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:48:19 np0005601977 NetworkManager[55565]: <info>  [1769766499.8641] device (tapf52a88e4-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:48:19 np0005601977 systemd[1]: Started Virtual Machine qemu-40-instance-00000031.
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.864 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc533b0-9a40-4602-8d25-da87f7183057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.877 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2e396d90-2da2-46ce-9ff1-e0c99a54ca4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.904 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[3af1b603-8205-4ae7-be7a-140ef4724f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.911 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7db7f956-07ec-41a6-974b-222411985b59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 systemd-udevd[227891]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:48:19 np0005601977 NetworkManager[55565]: <info>  [1769766499.9127] manager: (tap098dfb1c-70): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.940 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[dee935f1-f696-4dde-bb69-b5c31c39e271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.942 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e03aea-6e56-43bf-b24e-a672ae0f637c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 podman[227895]: 2026-01-30 09:48:19.957587243 +0000 UTC m=+0.058639364 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:48:19 np0005601977 podman[227893]: 2026-01-30 09:48:19.958281623 +0000 UTC m=+0.066299294 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:48:19 np0005601977 NetworkManager[55565]: <info>  [1769766499.9603] device (tap098dfb1c-70): carrier: link connected
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.964 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a8f5bc-a898-4771-879b-6a938fcd2657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.981 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1d88f2-bd8a-4581-b245-21676f0731cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap098dfb1c-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:db:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503130, 'reachable_time': 38115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227962, 'error': None, 'target': 'ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:19.993 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7766efe6-63ae-4cac-9e83-a675e2bf0fc0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:db92'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503130, 'tstamp': 503130}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227963, 'error': None, 'target': 'ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.005 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8cab11e1-f778-4cb9-b40e-872982ae8e0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap098dfb1c-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:db:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503130, 'reachable_time': 38115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227964, 'error': None, 'target': 'ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.024 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a914e8f0-658d-47ab-9145-e233b829193a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.065 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cffe178a-98ab-4a32-8a3f-11ea341c0f04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.066 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap098dfb1c-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.067 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.067 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap098dfb1c-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.069 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:20 np0005601977 kernel: tap098dfb1c-70: entered promiscuous mode
Jan 30 04:48:20 np0005601977 NetworkManager[55565]: <info>  [1769766500.0702] manager: (tap098dfb1c-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.071 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.073 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap098dfb1c-70, col_values=(('external_ids', {'iface-id': '1e40bb28-b30e-47c9-9279-d90df998ee62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.074 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:20Z|00499|binding|INFO|Releasing lport 1e40bb28-b30e-47c9-9279-d90df998ee62 from this chassis (sb_readonly=0)
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.075 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.075 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/098dfb1c-7f08-4bd3-9713-11f8f2a9351c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/098dfb1c-7f08-4bd3-9713-11f8f2a9351c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.076 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0d33d2-b200-4218-a5b7-871858e1b144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.076 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-098dfb1c-7f08-4bd3-9713-11f8f2a9351c
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/098dfb1c-7f08-4bd3-9713-11f8f2a9351c.pid.haproxy
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 098dfb1c-7f08-4bd3-9713-11f8f2a9351c
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:48:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:20.077 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'env', 'PROCESS_TAG=haproxy-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/098dfb1c-7f08-4bd3-9713-11f8f2a9351c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.079 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:20 np0005601977 podman[227996]: 2026-01-30 09:48:20.415333822 +0000 UTC m=+0.046849146 container create b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:48:20 np0005601977 systemd[1]: Started libpod-conmon-b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c.scope.
Jan 30 04:48:20 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:48:20 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4937dce7a25eaa7ed6a436120757c2a100abec5959cb6992594c24c7a30cdfef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:48:20 np0005601977 podman[227996]: 2026-01-30 09:48:20.474674465 +0000 UTC m=+0.106189809 container init b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:48:20 np0005601977 podman[227996]: 2026-01-30 09:48:20.480947895 +0000 UTC m=+0.112463219 container start b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:48:20 np0005601977 podman[227996]: 2026-01-30 09:48:20.393749812 +0000 UTC m=+0.025265156 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:48:20 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [NOTICE]   (228016) : New worker (228018) forked
Jan 30 04:48:20 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [NOTICE]   (228016) : Loading success.
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.648 183134 DEBUG nova.compute.manager [req-ab247703-fff5-4782-8740-ee311ef2a683 req-83063b0c-ca98-4346-9d16-66665709849f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.649 183134 DEBUG oslo_concurrency.lockutils [req-ab247703-fff5-4782-8740-ee311ef2a683 req-83063b0c-ca98-4346-9d16-66665709849f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.650 183134 DEBUG oslo_concurrency.lockutils [req-ab247703-fff5-4782-8740-ee311ef2a683 req-83063b0c-ca98-4346-9d16-66665709849f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.650 183134 DEBUG oslo_concurrency.lockutils [req-ab247703-fff5-4782-8740-ee311ef2a683 req-83063b0c-ca98-4346-9d16-66665709849f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.650 183134 DEBUG nova.compute.manager [req-ab247703-fff5-4782-8740-ee311ef2a683 req-83063b0c-ca98-4346-9d16-66665709849f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Processing event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.651 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766500.6464078, 3c0278c5-e992-4ef6-bf28-99033c90ee64 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.651 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] VM Started (Lifecycle Event)#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.653 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.657 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.662 183134 INFO nova.virt.libvirt.driver [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Instance spawned successfully.#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.662 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.681 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.686 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.689 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.690 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.690 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.691 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.691 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.691 183134 DEBUG nova.virt.libvirt.driver [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.736 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.737 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766500.6492143, 3c0278c5-e992-4ef6-bf28-99033c90ee64 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.737 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.768 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.772 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766500.6567554, 3c0278c5-e992-4ef6-bf28-99033c90ee64 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.773 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.797 183134 INFO nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Took 8.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.798 183134 DEBUG nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.800 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.806 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.841 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.862 183134 INFO nova.compute.manager [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Took 8.75 seconds to build instance.#033[00m
Jan 30 04:48:20 np0005601977 nova_compute[183130]: 2026-01-30 09:48:20.877 183134 DEBUG oslo_concurrency.lockutils [None req-a58d9686-323e-4a96-9068-13e17e084974 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.053 183134 DEBUG nova.network.neutron [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Updated VIF entry in instance network info cache for port f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.055 183134 DEBUG nova.network.neutron [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Updating instance_info_cache with network_info: [{"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.073 183134 DEBUG oslo_concurrency.lockutils [req-4b9f48df-0eee-44b0-9762-e3d7f0b0bfcf req-51742557-2f30-447c-946c-10d244020f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-3c0278c5-e992-4ef6-bf28-99033c90ee64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.192 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:21 np0005601977 nova_compute[183130]: 2026-01-30 09:48:21.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.748 183134 DEBUG nova.compute.manager [req-1a97f211-be34-407f-831c-25baafb63db1 req-040f8c1d-19f8-4de1-be5b-78d60261d1e9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.750 183134 DEBUG oslo_concurrency.lockutils [req-1a97f211-be34-407f-831c-25baafb63db1 req-040f8c1d-19f8-4de1-be5b-78d60261d1e9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.751 183134 DEBUG oslo_concurrency.lockutils [req-1a97f211-be34-407f-831c-25baafb63db1 req-040f8c1d-19f8-4de1-be5b-78d60261d1e9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.751 183134 DEBUG oslo_concurrency.lockutils [req-1a97f211-be34-407f-831c-25baafb63db1 req-040f8c1d-19f8-4de1-be5b-78d60261d1e9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.751 183134 DEBUG nova.compute.manager [req-1a97f211-be34-407f-831c-25baafb63db1 req-040f8c1d-19f8-4de1-be5b-78d60261d1e9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] No waiting events found dispatching network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.752 183134 WARNING nova.compute.manager [req-1a97f211-be34-407f-831c-25baafb63db1 req-040f8c1d-19f8-4de1-be5b-78d60261d1e9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received unexpected event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e for instance with vm_state active and task_state None.#033[00m
Jan 30 04:48:22 np0005601977 nova_compute[183130]: 2026-01-30 09:48:22.912 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:23 np0005601977 nova_compute[183130]: 2026-01-30 09:48:23.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:23 np0005601977 nova_compute[183130]: 2026-01-30 09:48:23.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:23 np0005601977 nova_compute[183130]: 2026-01-30 09:48:23.345 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:23 np0005601977 nova_compute[183130]: 2026-01-30 09:48:23.866 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.518 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.518 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.519 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:48:24 np0005601977 nova_compute[183130]: 2026-01-30 09:48:24.519 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:48:26 np0005601977 podman[228034]: 2026-01-30 09:48:26.895929277 +0000 UTC m=+0.111782990 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller)
Jan 30 04:48:27 np0005601977 nova_compute[183130]: 2026-01-30 09:48:27.915 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:28 np0005601977 nova_compute[183130]: 2026-01-30 09:48:28.506 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:28 np0005601977 nova_compute[183130]: 2026-01-30 09:48:28.521 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:48:28 np0005601977 nova_compute[183130]: 2026-01-30 09:48:28.521 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:48:28 np0005601977 nova_compute[183130]: 2026-01-30 09:48:28.910 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:32 np0005601977 nova_compute[183130]: 2026-01-30 09:48:32.950 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:33Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:51:f4:73 10.100.0.13
Jan 30 04:48:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:33Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:51:f4:73 10.100.0.13
Jan 30 04:48:33 np0005601977 nova_compute[183130]: 2026-01-30 09:48:33.912 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:34 np0005601977 nova_compute[183130]: 2026-01-30 09:48:34.515 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:35 np0005601977 podman[228072]: 2026-01-30 09:48:35.833503907 +0000 UTC m=+0.049779200 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:48:38 np0005601977 nova_compute[183130]: 2026-01-30 09:48:38.004 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:38 np0005601977 nova_compute[183130]: 2026-01-30 09:48:38.915 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:43 np0005601977 nova_compute[183130]: 2026-01-30 09:48:43.007 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:43 np0005601977 nova_compute[183130]: 2026-01-30 09:48:43.918 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.239 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.240 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.240 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.240 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.241 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.242 183134 INFO nova.compute.manager [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Terminating instance#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.244 183134 DEBUG nova.compute.manager [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:48:44 np0005601977 kernel: tapf52a88e4-1c (unregistering): left promiscuous mode
Jan 30 04:48:44 np0005601977 NetworkManager[55565]: <info>  [1769766524.2698] device (tapf52a88e4-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:48:44 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:44Z|00500|binding|INFO|Releasing lport f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e from this chassis (sb_readonly=0)
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.273 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:44Z|00501|binding|INFO|Setting lport f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e down in Southbound
Jan 30 04:48:44 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:44Z|00502|binding|INFO|Removing iface tapf52a88e4-1c ovn-installed in OVS
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.276 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.284 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:f4:73 10.100.0.13'], port_security=['fa:16:3e:51:f4:73 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3c0278c5-e992-4ef6-bf28-99033c90ee64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd1f6575f78b14c7d87ed252e603d0a6d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '990ab1f8-b356-4643-ab81-e26fc7994d7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccc9cad1-b1b5-42fa-b4d7-01d8d6c8aad6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.286 104706 INFO neutron.agent.ovn.metadata.agent [-] Port f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e in datapath 098dfb1c-7f08-4bd3-9713-11f8f2a9351c unbound from our chassis#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.288 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 098dfb1c-7f08-4bd3-9713-11f8f2a9351c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.289 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.291 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[47a45a98-cb68-4fc3-8e81-6e15e691b9f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.292 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c namespace which is not needed anymore#033[00m
Jan 30 04:48:44 np0005601977 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 30 04:48:44 np0005601977 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000031.scope: Consumed 12.945s CPU time.
Jan 30 04:48:44 np0005601977 systemd-machined[154431]: Machine qemu-40-instance-00000031 terminated.
Jan 30 04:48:44 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [NOTICE]   (228016) : haproxy version is 2.8.14-c23fe91
Jan 30 04:48:44 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [NOTICE]   (228016) : path to executable is /usr/sbin/haproxy
Jan 30 04:48:44 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [WARNING]  (228016) : Exiting Master process...
Jan 30 04:48:44 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [WARNING]  (228016) : Exiting Master process...
Jan 30 04:48:44 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [ALERT]    (228016) : Current worker (228018) exited with code 143 (Terminated)
Jan 30 04:48:44 np0005601977 neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c[228012]: [WARNING]  (228016) : All workers exited. Exiting... (0)
Jan 30 04:48:44 np0005601977 systemd[1]: libpod-b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c.scope: Deactivated successfully.
Jan 30 04:48:44 np0005601977 podman[228124]: 2026-01-30 09:48:44.452814222 +0000 UTC m=+0.055400088 container died b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.469 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.474 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c-userdata-shm.mount: Deactivated successfully.
Jan 30 04:48:44 np0005601977 systemd[1]: var-lib-containers-storage-overlay-4937dce7a25eaa7ed6a436120757c2a100abec5959cb6992594c24c7a30cdfef-merged.mount: Deactivated successfully.
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.505 183134 INFO nova.virt.libvirt.driver [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Instance destroyed successfully.#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.505 183134 DEBUG nova.objects.instance [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lazy-loading 'resources' on Instance uuid 3c0278c5-e992-4ef6-bf28-99033c90ee64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:48:44 np0005601977 podman[228124]: 2026-01-30 09:48:44.507114867 +0000 UTC m=+0.109700733 container cleanup b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.520 183134 DEBUG nova.virt.libvirt.vif [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:48:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-636368625',display_name='tempest-TestServerMultinode-server-636368625',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-636368625',id=49,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:48:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d1f6575f78b14c7d87ed252e603d0a6d',ramdisk_id='',reservation_id='r-an9lk7m2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1725111207',owner_user_name='tempest-TestServerMultinode-1725111207-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:48:20Z,user_data=None,user_id='bc74e93bd60c4618af5bc22cb2c9535e',uuid=3c0278c5-e992-4ef6-bf28-99033c90ee64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.521 183134 DEBUG nova.network.os_vif_util [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Converting VIF {"id": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "address": "fa:16:3e:51:f4:73", "network": {"id": "098dfb1c-7f08-4bd3-9713-11f8f2a9351c", "bridge": "br-int", "label": "tempest-TestServerMultinode-1027074824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84d458dbf0064c27856c66d5dfd435e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf52a88e4-1c", "ovs_interfaceid": "f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.522 183134 DEBUG nova.network.os_vif_util [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.523 183134 DEBUG os_vif [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:48:44 np0005601977 systemd[1]: libpod-conmon-b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c.scope: Deactivated successfully.
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.524 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.525 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf52a88e4-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.527 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.529 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.531 183134 INFO os_vif [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:f4:73,bridge_name='br-int',has_traffic_filtering=True,id=f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e,network=Network(098dfb1c-7f08-4bd3-9713-11f8f2a9351c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf52a88e4-1c')#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.532 183134 INFO nova.virt.libvirt.driver [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Deleting instance files /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64_del#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.533 183134 INFO nova.virt.libvirt.driver [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Deletion of /var/lib/nova/instances/3c0278c5-e992-4ef6-bf28-99033c90ee64_del complete#033[00m
Jan 30 04:48:44 np0005601977 podman[228171]: 2026-01-30 09:48:44.574225849 +0000 UTC m=+0.044859336 container remove b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.581 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3a0113-f467-4d1e-9dc6-a217d8812040]: (4, ('Fri Jan 30 09:48:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c (b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c)\nb0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c\nFri Jan 30 09:48:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c (b0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c)\nb0a32ea5a64e35120eddff1cba38c6f8c8e14e17344d090a4322dbbd6900616c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.584 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b9d18c-bb1f-4d48-9e4f-df11f8811195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.586 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap098dfb1c-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.589 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 kernel: tap098dfb1c-70: left promiscuous mode
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.599 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.601 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.603 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[837ba974-86ec-44f0-a21d-1c0e0295056c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.626 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aac50e32-b49e-4f74-9c85-fc02e4f49f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.629 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[345a6e5a-4c36-4df0-9268-d783309c55fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.651 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e82f6d9c-0458-47c0-80ce-b7220d34b044]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503123, 'reachable_time': 39781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228186, 'error': None, 'target': 'ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 systemd[1]: run-netns-ovnmeta\x2d098dfb1c\x2d7f08\x2d4bd3\x2d9713\x2d11f8f2a9351c.mount: Deactivated successfully.
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.656 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-098dfb1c-7f08-4bd3-9713-11f8f2a9351c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:48:44 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:44.657 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[21bcff7d-36ab-4d04-b371-ad64cfc95100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.825 183134 DEBUG nova.compute.manager [req-493875c5-efa9-4461-82b3-03b691be34a2 req-038af629-7308-4b59-92e1-fbb5ee79dc89 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-vif-unplugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.826 183134 DEBUG oslo_concurrency.lockutils [req-493875c5-efa9-4461-82b3-03b691be34a2 req-038af629-7308-4b59-92e1-fbb5ee79dc89 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.826 183134 DEBUG oslo_concurrency.lockutils [req-493875c5-efa9-4461-82b3-03b691be34a2 req-038af629-7308-4b59-92e1-fbb5ee79dc89 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.826 183134 DEBUG oslo_concurrency.lockutils [req-493875c5-efa9-4461-82b3-03b691be34a2 req-038af629-7308-4b59-92e1-fbb5ee79dc89 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.826 183134 DEBUG nova.compute.manager [req-493875c5-efa9-4461-82b3-03b691be34a2 req-038af629-7308-4b59-92e1-fbb5ee79dc89 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] No waiting events found dispatching network-vif-unplugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.826 183134 DEBUG nova.compute.manager [req-493875c5-efa9-4461-82b3-03b691be34a2 req-038af629-7308-4b59-92e1-fbb5ee79dc89 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-vif-unplugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.835 183134 INFO nova.compute.manager [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.836 183134 DEBUG oslo.service.loopingcall [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.836 183134 DEBUG nova.compute.manager [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:48:44 np0005601977 nova_compute[183130]: 2026-01-30 09:48:44.837 183134 DEBUG nova.network.neutron [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.136 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:45.137 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:48:45 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:45.140 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.517 183134 DEBUG nova.network.neutron [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.541 183134 INFO nova.compute.manager [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Took 0.70 seconds to deallocate network for instance.#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.591 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.591 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.670 183134 DEBUG nova.compute.provider_tree [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.697 183134 DEBUG nova.scheduler.client.report [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.722 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.751 183134 INFO nova.scheduler.client.report [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Deleted allocations for instance 3c0278c5-e992-4ef6-bf28-99033c90ee64#033[00m
Jan 30 04:48:45 np0005601977 nova_compute[183130]: 2026-01-30 09:48:45.826 183134 DEBUG oslo_concurrency.lockutils [None req-915f7654-0927-4a44-b975-35f52d2bc419 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:45 np0005601977 podman[228188]: 2026-01-30 09:48:45.850729035 +0000 UTC m=+0.063588602 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:48:45 np0005601977 podman[228187]: 2026-01-30 09:48:45.851180238 +0000 UTC m=+0.065210488 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, release=1769056855)
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.963 183134 DEBUG nova.compute.manager [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.963 183134 DEBUG oslo_concurrency.lockutils [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.964 183134 DEBUG oslo_concurrency.lockutils [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.964 183134 DEBUG oslo_concurrency.lockutils [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "3c0278c5-e992-4ef6-bf28-99033c90ee64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.965 183134 DEBUG nova.compute.manager [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] No waiting events found dispatching network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.965 183134 WARNING nova.compute.manager [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received unexpected event network-vif-plugged-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:48:46 np0005601977 nova_compute[183130]: 2026-01-30 09:48:46.966 183134 DEBUG nova.compute.manager [req-f04f6e5b-6028-4925-8497-83140c2857a2 req-73c5144e-a9a6-418a-992e-e5f0f40d9760 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Received event network-vif-deleted-f52a88e4-1cf4-4d4d-88d5-7728d9ffcd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:48 np0005601977 nova_compute[183130]: 2026-01-30 09:48:48.052 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:48Z|00503|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:48:48 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:48Z|00504|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:48:48 np0005601977 nova_compute[183130]: 2026-01-30 09:48:48.649 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:49 np0005601977 nova_compute[183130]: 2026-01-30 09:48:49.589 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:50 np0005601977 podman[228230]: 2026-01-30 09:48:50.841293068 +0000 UTC m=+0.061240785 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 30 04:48:50 np0005601977 podman[228231]: 2026-01-30 09:48:50.874273473 +0000 UTC m=+0.088586628 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:48:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:51.143 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:53 np0005601977 nova_compute[183130]: 2026-01-30 09:48:53.054 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:53Z|00505|binding|INFO|Releasing lport b9267414-60b5-4998-b9cd-8b1c6a718595 from this chassis (sb_readonly=0)
Jan 30 04:48:53 np0005601977 ovn_controller[95460]: 2026-01-30T09:48:53Z|00506|binding|INFO|Releasing lport d6e15ff1-8451-4134-9247-4d8c23ead538 from this chassis (sb_readonly=0)
Jan 30 04:48:53 np0005601977 nova_compute[183130]: 2026-01-30 09:48:53.820 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:54 np0005601977 nova_compute[183130]: 2026-01-30 09:48:54.590 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:57.402 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:57.402 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:48:57.403 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:57 np0005601977 podman[228275]: 2026-01-30 09:48:57.910330485 +0000 UTC m=+0.120689138 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:48:58 np0005601977 nova_compute[183130]: 2026-01-30 09:48:58.056 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:59 np0005601977 nova_compute[183130]: 2026-01-30 09:48:59.503 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766524.502408, 3c0278c5-e992-4ef6-bf28-99033c90ee64 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:48:59 np0005601977 nova_compute[183130]: 2026-01-30 09:48:59.503 183134 INFO nova.compute.manager [-] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:48:59 np0005601977 nova_compute[183130]: 2026-01-30 09:48:59.592 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:48:59 np0005601977 nova_compute[183130]: 2026-01-30 09:48:59.729 183134 DEBUG nova.compute.manager [None req-efbd101f-da4f-46c2-b0d0-6c8b7e4b5ccc - - - - - -] [instance: 3c0278c5-e992-4ef6-bf28-99033c90ee64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:49:03 np0005601977 nova_compute[183130]: 2026-01-30 09:49:03.057 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:04 np0005601977 nova_compute[183130]: 2026-01-30 09:49:04.594 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:06 np0005601977 podman[228300]: 2026-01-30 09:49:06.834061538 +0000 UTC m=+0.052081332 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:49:08 np0005601977 nova_compute[183130]: 2026-01-30 09:49:08.060 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:09 np0005601977 nova_compute[183130]: 2026-01-30 09:49:09.595 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.346 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.376 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.377 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.378 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.378 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.450 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.505 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.507 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.552 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.717 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.718 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5468MB free_disk=73.21945571899414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.718 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.719 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.801 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.802 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.802 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.848 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.864 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.885 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:49:11 np0005601977 nova_compute[183130]: 2026-01-30 09:49:11.885 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:49:13 np0005601977 nova_compute[183130]: 2026-01-30 09:49:13.063 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:14 np0005601977 nova_compute[183130]: 2026-01-30 09:49:14.598 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:16 np0005601977 podman[228331]: 2026-01-30 09:49:16.844228417 +0000 UTC m=+0.062373327 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:49:16 np0005601977 podman[228332]: 2026-01-30 09:49:16.845155804 +0000 UTC m=+0.061813152 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:49:18 np0005601977 nova_compute[183130]: 2026-01-30 09:49:18.066 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:19 np0005601977 nova_compute[183130]: 2026-01-30 09:49:19.602 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:21 np0005601977 podman[228377]: 2026-01-30 09:49:21.845401291 +0000 UTC m=+0.052032491 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:49:21 np0005601977 podman[228376]: 2026-01-30 09:49:21.864728364 +0000 UTC m=+0.075959076 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:49:21 np0005601977 nova_compute[183130]: 2026-01-30 09:49:21.882 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:22 np0005601977 nova_compute[183130]: 2026-01-30 09:49:22.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:22 np0005601977 nova_compute[183130]: 2026-01-30 09:49:22.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:49:23 np0005601977 nova_compute[183130]: 2026-01-30 09:49:23.067 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:23 np0005601977 nova_compute[183130]: 2026-01-30 09:49:23.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:23 np0005601977 nova_compute[183130]: 2026-01-30 09:49:23.345 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:24 np0005601977 nova_compute[183130]: 2026-01-30 09:49:24.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:24 np0005601977 nova_compute[183130]: 2026-01-30 09:49:24.604 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:24 np0005601977 ovn_controller[95460]: 2026-01-30T09:49:24Z|00507|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Jan 30 04:49:25 np0005601977 nova_compute[183130]: 2026-01-30 09:49:25.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:26 np0005601977 nova_compute[183130]: 2026-01-30 09:49:26.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:26 np0005601977 nova_compute[183130]: 2026-01-30 09:49:26.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:49:26 np0005601977 nova_compute[183130]: 2026-01-30 09:49:26.365 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:49:28 np0005601977 nova_compute[183130]: 2026-01-30 09:49:28.069 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:28 np0005601977 podman[228418]: 2026-01-30 09:49:28.8641531 +0000 UTC m=+0.083278166 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:49:29 np0005601977 nova_compute[183130]: 2026-01-30 09:49:29.605 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:30 np0005601977 nova_compute[183130]: 2026-01-30 09:49:30.360 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:31 np0005601977 nova_compute[183130]: 2026-01-30 09:49:31.363 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:33 np0005601977 nova_compute[183130]: 2026-01-30 09:49:33.072 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:34 np0005601977 nova_compute[183130]: 2026-01-30 09:49:34.606 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:37 np0005601977 podman[228444]: 2026-01-30 09:49:37.847268464 +0000 UTC m=+0.059146075 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:49:38 np0005601977 nova_compute[183130]: 2026-01-30 09:49:38.074 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:39 np0005601977 nova_compute[183130]: 2026-01-30 09:49:39.610 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:43 np0005601977 nova_compute[183130]: 2026-01-30 09:49:43.075 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:44 np0005601977 nova_compute[183130]: 2026-01-30 09:49:44.613 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:47 np0005601977 podman[228470]: 2026-01-30 09:49:47.848843764 +0000 UTC m=+0.058517867 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute)
Jan 30 04:49:47 np0005601977 podman[228469]: 2026-01-30 09:49:47.86199391 +0000 UTC m=+0.068586405 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': 
['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 30 04:49:48 np0005601977 nova_compute[183130]: 2026-01-30 09:49:48.077 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:49 np0005601977 nova_compute[183130]: 2026-01-30 09:49:49.614 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:52 np0005601977 podman[228512]: 2026-01-30 09:49:52.8532005 +0000 UTC m=+0.061088281 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:49:52 np0005601977 podman[228513]: 2026-01-30 09:49:52.853308503 +0000 UTC m=+0.055316246 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:49:53 np0005601977 nova_compute[183130]: 2026-01-30 09:49:53.080 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:54 np0005601977 nova_compute[183130]: 2026-01-30 09:49:54.616 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.457 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'name': 'tempest-TestGettingAddress-server-320003657', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'hostId': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.473 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.474 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85faf035-c21b-4376-bb67-60de31998fa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.458397', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '08172c62-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.853309292, 'message_signature': 'c3678a9339be5f2e88a56812e206f36490da20deb90cc743995e27300c4d565f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.458397', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '08174242-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.853309292, 'message_signature': 'cc5bb64c51743a027eaf39e30b479a9e2f203cb1e5937e30f1fcb3a149edda93'}]}, 'timestamp': '2026-01-30 09:49:55.474544', '_unique_id': 'faff3384e46a45568f14bb79236ebc53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.477 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.496 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/cpu volume: 12340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '280985b5-1a9b-4b6e-811a-803dd1507cde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12340000000, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'timestamp': '2026-01-30T09:49:55.478058', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '081aa9f0-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.89095674, 'message_signature': '8848964757ec0ef3e49d41d88ed0007df18f9741c04da83ae6486fbe31be658a'}]}, 'timestamp': '2026-01-30 09:49:55.496884', '_unique_id': 'ad1cf592555d4d95af27422c2cb16c14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.499 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.499 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/memory.usage volume: 43.90234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0fe277e-435e-4fd7-b22a-ac3d2f1ae9c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.90234375, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'timestamp': '2026-01-30T09:49:55.499688', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '081b2fb0-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.89095674, 'message_signature': 'e02768e7e0ff59dacb5610186ddd9911701930187054bc715c9508a7e5f01f21'}]}, 'timestamp': '2026-01-30 09:49:55.500325', '_unique_id': '784a82ee1ef645889d3a650936763355'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.506 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes.delta volume: 23736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.507 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes.delta volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3707ae1-bf7f-42d1-9e5c-d6ace83d228b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 23736, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.502606', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '081c4cd8-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '506836c5f05cb34352b70c0e5ff2a976a53f3e10d6e68703416072340aac1fdf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.502606', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '081c6114-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '55cd2345b4db700ffbe4f04ac8770e288229cf71223f96606f5b3650f69109dd'}]}, 'timestamp': '2026-01-30 09:49:55.508089', '_unique_id': 'dfcd246588944d7aa8eca72576d3db2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.510 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes volume: 27126 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.511 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.bytes volume: 5040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fa1c2e5-3707-4846-9b3e-24be62b92a59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27126, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.510646', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '081cdbc6-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'c17fa9d790efc660c4ef0df0f5878da0f11194d3b503a7fbe424dc8a4ed0894a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5040, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.510646', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '081cf0e8-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '9cfc8f968ee805f998de01bf4725dcada8ca9e4bb78f711450b48ca12aaf65e5'}]}, 'timestamp': '2026-01-30 09:49:55.511769', '_unique_id': 'c29ef6cdccb7450b9d79b584e9b5166d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.548 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.bytes volume: 73187328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.549 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9226ed99-3491-4651-94b9-eece61d65ea9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73187328, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.514077', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '08229e30-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '6dd8b6e0132a9b2dfdc55adab5ce8057b239a8389debf7c1913638934089138a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.514077', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0822b2da-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '8c7b74a787f2ee1ff4590325b26a53017645b2f4ffcc8041fd7d45af2337eb38'}]}, 'timestamp': '2026-01-30 09:49:55.549482', '_unique_id': 'b615a9126c0f4f4cab17913e761425ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.550 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.552 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.552 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets volume: 172 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.552 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets volume: 36 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c634e04-f13b-4fe4-b91b-20bb7d7770d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 172, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.552239', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '0823320a-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '1faaa0dc5120891e42f2d343bdf9b5fb8c97cece28edcdee332ae12134e921fe'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 36, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.552239', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '0823438a-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'd7dcf1bdf394ec9ba1717c4a7ea58aa5da9dcafdcb03f88de0d4c8e05c73acf7'}]}, 'timestamp': '2026-01-30 09:49:55.553208', '_unique_id': 'fa1d3b11592543d9bc67233422c48864'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.555 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.latency volume: 494706335 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.555 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.latency volume: 43086902 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6a1dd0d-30b6-4c3c-a885-fb76a652d196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 494706335, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.555403', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0823abc2-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '21cbb047e1dbec391c10795238002e2b25ed4fb49a4131769d08c665172f2f8f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43086902, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.555403', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0823bbda-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '4bc7f312469be0a74642d5118c213f8f746bd82aa55e40d66844e8535b41b3c6'}]}, 'timestamp': '2026-01-30 09:49:55.556272', '_unique_id': 'e0b8808c93504fdcaea068087b933766'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.558 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.558 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.bytes volume: 29489664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.559 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29608f57-9f6f-4ce9-83cc-d207a9af89bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29489664, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.558723', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '08242d40-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': 'b3a038983b582d6d55a75f526f94cb41bd492a63a91cc6f771073f0c367679fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.558723', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '08243ee8-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '6a8882b15f10178ac7590d06b9fac2a594d5cf77d15632363279c91979518a67'}]}, 'timestamp': '2026-01-30 09:49:55.559602', '_unique_id': '4ee30454bfb94b7fb199b4ee2746fd6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.561 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.561 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.requests volume: 1069 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.562 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63941002-bc41-4b53-89cc-a4708f7107b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1069, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.561797', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0824a54a-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '4d6fdecb7c55875bb675a0672d17e6bc0bc9c901afab7352b6d1a9f7e055d687'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.561797', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0824b6f2-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '5f85badcedfeaef4582f3da864c963ef15b9f118cec658ffeb223cbffee46403'}]}, 'timestamp': '2026-01-30 09:49:55.562674', '_unique_id': 'd415614ccfb74601aecb01a9781add27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.565 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.565 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.565 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.565 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18881ead-af09-4432-b872-e1bd4817855e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.565404', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '08253348-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.853309292, 'message_signature': '97ada928b17ee913b87c2450561a5d33c4944eac823376fea186bca3ab7656df'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.565404', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0825434c-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.853309292, 'message_signature': 'a5752a5a512cc24152dd2f7cc4feacd27e4941f81905f2db479f1938a2d78ae5'}]}, 'timestamp': '2026-01-30 09:49:55.566302', '_unique_id': 'e69e925a9f8648e787cdc870c8203f82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.568 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.568 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes.delta volume: 25180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.568 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes.delta volume: 644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebea694b-bc9b-482e-a1a0-6f8efc27bce4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 25180, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.568423', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '0825a9a4-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'b2493a937c6b9bda6dbea413356f6f808687c799c5805a0ca3a1509baff402b7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 644, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.568423', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '0825ba7a-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'ae843ffbf06dc1fe8215d13fb554c1ede462f724ba0759abaa9dc112f76e078a'}]}, 'timestamp': '2026-01-30 09:49:55.569360', '_unique_id': 'c6e583bd86074cfd935be8483724b15e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.571 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.571 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.571 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b6b5c18-055d-49fc-adcb-8c8a8057375a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.571482', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '08261fba-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'bc015c3fb0bdf2bf29464066297b26d0a7e91e0c559311576a3b927a8d007377'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.571482', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '08263040-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'e75548863be9177c3e167824d7a48fb8282c47a4cfede113b003da871b1f9d23'}]}, 'timestamp': '2026-01-30 09:49:55.572372', '_unique_id': '766ebea281234de49d693eade2427ca0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.573 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.574 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.574 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.575 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c38b8c29-31db-4030-9ad2-034c3d6e4b87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.574795', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0826a0de-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.853309292, 'message_signature': '0d400f0fce2a3c2e7447f10425b2668ae311325ca0b31e872354807726d38ed2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.574795', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0826b204-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.853309292, 'message_signature': '906eb5e99c47154df2e858f53dbb6d068af253bd0162648f603bbb352dd0f0b0'}]}, 'timestamp': '2026-01-30 09:49:55.575651', '_unique_id': '273050d957a24d439d704a57cf192af1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.577 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.577 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes volume: 29459 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.578 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.bytes volume: 2354 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddb11ffa-b425-4806-9cc7-293b22d7c2a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29459, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.577911', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '08271a78-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'f3f1d7a241a0dfb6e3074164958ccee53754cb610effb40042621f00f06449cb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2354, 'user_id': 
'4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.577911', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '08272cd4-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '56c2ed5c204e34963186d30f7ef58b537cacafb196bfb3e82e9e3263cf1240b7'}]}, 'timestamp': '2026-01-30 09:49:55.578812', '_unique_id': 'fd29528cc1764cff87356950221c512a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.581 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets volume: 162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.581 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '290c4049-f3bc-41b1-b82f-9bbc2f313e9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 162, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.581015', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '0827996c-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '773a2489a42b5beccec012a06fa2270f8cc214ba70895326ab53fcf049f78015'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.581015', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '0827a560-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '60f3db5a9ac7b0466ac96de13d234a794144978821eff3ebbeb07c94ab117b7a'}]}, 'timestamp': '2026-01-30 09:49:55.581828', '_unique_id': 'a15fe045c2da406a9da5e7e44c9b9e19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.583 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.requests volume: 347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.583 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c3be197-aa72-47ba-b7a4-6178bf52369f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 347, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.583327', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0827ebf6-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '56ca685de61ae90e02e6c211683a401ed7776b20126be66f693cc7ed490e171f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': 
None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.583327', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0827f6be-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '9b0fa7612acbf6589838eeed0953a9f2ca6399103b2b9ea0aa3f13f5b0ed895d'}]}, 'timestamp': '2026-01-30 09:49:55.583899', '_unique_id': 'f64a87590eee44538efdfe9cf8e0c92b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.585 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.585 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '679eace8-3e73-440c-b430-5e588e9cbb15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.585373', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '08283ba6-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '0d9c0b63cc1bfe07516be442ba47e9ddece5f8e83d5ccc44f301b7f179b066d3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.585373', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '0828470e-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '02751a61a2fd41f368e2e5706ed0a3933425f71ddb40d3de41d853105a104bfd'}]}, 'timestamp': '2026-01-30 09:49:55.585978', '_unique_id': '4a154c4f26bb45a3ae3538232483a93d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.586 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.587 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.587 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5947059f-b1bd-44c8-9b82-05f32012a2f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.587534', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '08289010-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': 'b609580e731d25e37ebbaa666454e36ccb9e34c5a3cba95f6b2f614b0e3eca8c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.587534', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '08289b82-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '1e766b8ff7834f84436d04b9db0a04aab04deb3c4356d75ae44b31ae24c722da'}]}, 'timestamp': '2026-01-30 09:49:55.588128', '_unique_id': '039bf2f340064097872dccc00684c599'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.589 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.latency volume: 1641749191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.589 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e82db78-806c-403c-a967-24fc32f32206', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1641749191, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-vda', 'timestamp': '2026-01-30T09:49:55.589662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0828e312-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': 'fa0f4241523160b9622e1178c7f5b1b9cc6d0ceaf11fbda8469b84335a1d89b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '468f5a89-b848-45a6-8649-d09040ab2a09-sda', 'timestamp': '2026-01-30T09:49:55.589662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'instance-0000002d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0828ede4-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.909063069, 'message_signature': '2ad78764efb9e225eabfa8f71a5bd3dfdd3a992d6ff8e36701fb7eb5c010b210'}]}, 'timestamp': '2026-01-30 09:49:55.590266', '_unique_id': '040b6729982a49f8b0ee0b744976e63a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.591 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.592 12 DEBUG ceilometer.compute.pollsters [-] 468f5a89-b848-45a6-8649-d09040ab2a09/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d361dfa-01f6-4b87-95a8-70c9e7a06d60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap57979c3b-0d', 'timestamp': '2026-01-30T09:49:55.591779', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap57979c3b-0d', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:cf:e2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap57979c3b-0d'}, 'message_id': '082935ec-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '13987e9c886db023e3f389ccbf40d5b8b7f795ba733285212ddb8d0e2dc5d1cc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-0000002d-468f5a89-b848-45a6-8649-d09040ab2a09-tap4088bc52-1b', 'timestamp': '2026-01-30T09:49:55.591779', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-320003657', 'name': 'tap4088bc52-1b', 'instance_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:a9:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4088bc52-1b'}, 'message_id': '082942a8-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5126.897549739, 'message_signature': '7c33690ac46ebf8efed5c27ab3b7d8d21bad6eb5367785a3aca091074d7aa00d'}]}, 'timestamp': '2026-01-30 09:49:55.592411', '_unique_id': '7ecb4a2e1c704b75bf963cf08c37beaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:49:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:49:55.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:49:57.404 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:49:57.405 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:49:57.406 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:49:58 np0005601977 nova_compute[183130]: 2026-01-30 09:49:58.082 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:59 np0005601977 nova_compute[183130]: 2026-01-30 09:49:59.618 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:49:59 np0005601977 podman[228557]: 2026-01-30 09:49:59.872799283 +0000 UTC m=+0.085925101 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 30 04:50:03 np0005601977 nova_compute[183130]: 2026-01-30 09:50:03.084 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:04 np0005601977 nova_compute[183130]: 2026-01-30 09:50:04.620 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:08 np0005601977 nova_compute[183130]: 2026-01-30 09:50:08.086 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:08 np0005601977 podman[228585]: 2026-01-30 09:50:08.832464634 +0000 UTC m=+0.047673966 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:50:09 np0005601977 nova_compute[183130]: 2026-01-30 09:50:09.621 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.088 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.441 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.442 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.442 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.443 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.610 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.669 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.670 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.726 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.890 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.891 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5471MB free_disk=73.21945190429688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.891 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.892 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.965 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.966 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:50:13 np0005601977 nova_compute[183130]: 2026-01-30 09:50:13.966 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:50:14 np0005601977 nova_compute[183130]: 2026-01-30 09:50:14.009 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:50:14 np0005601977 nova_compute[183130]: 2026-01-30 09:50:14.025 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:50:14 np0005601977 nova_compute[183130]: 2026-01-30 09:50:14.026 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:50:14 np0005601977 nova_compute[183130]: 2026-01-30 09:50:14.026 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:50:14 np0005601977 nova_compute[183130]: 2026-01-30 09:50:14.623 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:18 np0005601977 nova_compute[183130]: 2026-01-30 09:50:18.092 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:18 np0005601977 podman[228615]: 2026-01-30 09:50:18.831145312 +0000 UTC m=+0.049539979 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, managed_by=edpm_ansible, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855)
Jan 30 04:50:18 np0005601977 podman[228616]: 2026-01-30 09:50:18.840388297 +0000 UTC m=+0.049432387 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:50:19 np0005601977 nova_compute[183130]: 2026-01-30 09:50:19.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:23 np0005601977 nova_compute[183130]: 2026-01-30 09:50:23.028 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:23 np0005601977 nova_compute[183130]: 2026-01-30 09:50:23.093 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:23 np0005601977 nova_compute[183130]: 2026-01-30 09:50:23.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:23 np0005601977 podman[228655]: 2026-01-30 09:50:23.857076808 +0000 UTC m=+0.069513122 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:50:23 np0005601977 podman[228656]: 2026-01-30 09:50:23.857087018 +0000 UTC m=+0.066761063 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:50:24 np0005601977 nova_compute[183130]: 2026-01-30 09:50:24.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:24 np0005601977 nova_compute[183130]: 2026-01-30 09:50:24.342 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:50:24 np0005601977 nova_compute[183130]: 2026-01-30 09:50:24.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:25 np0005601977 nova_compute[183130]: 2026-01-30 09:50:25.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:25 np0005601977 nova_compute[183130]: 2026-01-30 09:50:25.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.726 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.726 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.727 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:50:26 np0005601977 nova_compute[183130]: 2026-01-30 09:50:26.727 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:50:28 np0005601977 nova_compute[183130]: 2026-01-30 09:50:28.095 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:29 np0005601977 nova_compute[183130]: 2026-01-30 09:50:29.627 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:29 np0005601977 nova_compute[183130]: 2026-01-30 09:50:29.793 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:50:29 np0005601977 nova_compute[183130]: 2026-01-30 09:50:29.817 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:50:29 np0005601977 nova_compute[183130]: 2026-01-30 09:50:29.818 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:50:29 np0005601977 nova_compute[183130]: 2026-01-30 09:50:29.819 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:30 np0005601977 podman[228694]: 2026-01-30 09:50:30.931287783 +0000 UTC m=+0.141578936 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:50:33 np0005601977 nova_compute[183130]: 2026-01-30 09:50:33.098 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:34 np0005601977 nova_compute[183130]: 2026-01-30 09:50:34.629 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:35 np0005601977 nova_compute[183130]: 2026-01-30 09:50:35.815 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:38 np0005601977 nova_compute[183130]: 2026-01-30 09:50:38.099 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:39 np0005601977 nova_compute[183130]: 2026-01-30 09:50:39.631 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:39 np0005601977 podman[228721]: 2026-01-30 09:50:39.838427292 +0000 UTC m=+0.055779028 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:50:43 np0005601977 nova_compute[183130]: 2026-01-30 09:50:43.101 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:44 np0005601977 nova_compute[183130]: 2026-01-30 09:50:44.633 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:48 np0005601977 nova_compute[183130]: 2026-01-30 09:50:48.104 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:49 np0005601977 nova_compute[183130]: 2026-01-30 09:50:49.634 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:49 np0005601977 podman[228745]: 2026-01-30 09:50:49.843069953 +0000 UTC m=+0.062855131 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Jan 30 04:50:49 np0005601977 podman[228746]: 2026-01-30 09:50:49.84716527 +0000 UTC m=+0.062830320 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:50:53 np0005601977 nova_compute[183130]: 2026-01-30 09:50:53.106 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:54 np0005601977 nova_compute[183130]: 2026-01-30 09:50:54.636 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:54 np0005601977 podman[228787]: 2026-01-30 09:50:54.831430353 +0000 UTC m=+0.049818248 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 30 04:50:54 np0005601977 podman[228788]: 2026-01-30 09:50:54.852012972 +0000 UTC m=+0.062132470 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:50:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:50:57.405 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:50:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:50:57.405 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:50:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:50:57.406 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:50:58 np0005601977 nova_compute[183130]: 2026-01-30 09:50:58.140 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:50:59 np0005601977 nova_compute[183130]: 2026-01-30 09:50:59.637 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:01 np0005601977 podman[228831]: 2026-01-30 09:51:01.88019533 +0000 UTC m=+0.098624126 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:51:03 np0005601977 nova_compute[183130]: 2026-01-30 09:51:03.174 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:04 np0005601977 nova_compute[183130]: 2026-01-30 09:51:04.638 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:08 np0005601977 nova_compute[183130]: 2026-01-30 09:51:08.217 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:09 np0005601977 nova_compute[183130]: 2026-01-30 09:51:09.640 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:10 np0005601977 podman[228857]: 2026-01-30 09:51:10.868359597 +0000 UTC m=+0.086159688 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:51:13 np0005601977 nova_compute[183130]: 2026-01-30 09:51:13.264 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:14 np0005601977 nova_compute[183130]: 2026-01-30 09:51:14.642 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.409 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.410 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.411 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.411 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.567 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.644 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.646 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.720 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.906 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.908 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5469MB free_disk=73.21955490112305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.908 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:15 np0005601977 nova_compute[183130]: 2026-01-30 09:51:15.908 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.103 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 468f5a89-b848-45a6-8649-d09040ab2a09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.104 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.105 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.179 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.242 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.245 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:51:16 np0005601977 nova_compute[183130]: 2026-01-30 09:51:16.245 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:18 np0005601977 nova_compute[183130]: 2026-01-30 09:51:18.266 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:19 np0005601977 nova_compute[183130]: 2026-01-30 09:51:19.644 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:19.692 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:51:19 np0005601977 nova_compute[183130]: 2026-01-30 09:51:19.692 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:19.694 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:51:20 np0005601977 podman[228887]: 2026-01-30 09:51:20.845637313 +0000 UTC m=+0.056276733 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:51:20 np0005601977 podman[228886]: 2026-01-30 09:51:20.867092777 +0000 UTC m=+0.080667391 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 30 04:51:23 np0005601977 nova_compute[183130]: 2026-01-30 09:51:23.268 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:24 np0005601977 nova_compute[183130]: 2026-01-30 09:51:24.246 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:24 np0005601977 nova_compute[183130]: 2026-01-30 09:51:24.646 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:24 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:24.695 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:51:25 np0005601977 nova_compute[183130]: 2026-01-30 09:51:25.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:25 np0005601977 nova_compute[183130]: 2026-01-30 09:51:25.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:25 np0005601977 podman[228926]: 2026-01-30 09:51:25.846942863 +0000 UTC m=+0.056008025 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:51:25 np0005601977 podman[228925]: 2026-01-30 09:51:25.854552531 +0000 UTC m=+0.066447064 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 30 04:51:26 np0005601977 nova_compute[183130]: 2026-01-30 09:51:26.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:26 np0005601977 nova_compute[183130]: 2026-01-30 09:51:26.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:26 np0005601977 nova_compute[183130]: 2026-01-30 09:51:26.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.206 183134 DEBUG nova.compute.manager [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-changed-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.207 183134 DEBUG nova.compute.manager [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing instance network info cache due to event network-changed-57979c3b-0d30-474f-b9e6-16ccca270fbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.208 183134 DEBUG oslo_concurrency.lockutils [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.209 183134 DEBUG oslo_concurrency.lockutils [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.209 183134 DEBUG nova.network.neutron [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Refreshing network info cache for port 57979c3b-0d30-474f-b9e6-16ccca270fbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.408 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.409 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.409 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.409 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.410 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.412 183134 INFO nova.compute.manager [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Terminating instance#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.413 183134 DEBUG nova.compute.manager [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:51:27 np0005601977 kernel: tap57979c3b-0d (unregistering): left promiscuous mode
Jan 30 04:51:27 np0005601977 NetworkManager[55565]: <info>  [1769766687.4448] device (tap57979c3b-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.451 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:51:27Z|00508|binding|INFO|Releasing lport 57979c3b-0d30-474f-b9e6-16ccca270fbf from this chassis (sb_readonly=0)
Jan 30 04:51:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:51:27Z|00509|binding|INFO|Setting lport 57979c3b-0d30-474f-b9e6-16ccca270fbf down in Southbound
Jan 30 04:51:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:51:27Z|00510|binding|INFO|Removing iface tap57979c3b-0d ovn-installed in OVS
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.457 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.460 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 kernel: tap4088bc52-1b (unregistering): left promiscuous mode
Jan 30 04:51:27 np0005601977 NetworkManager[55565]: <info>  [1769766687.4674] device (tap4088bc52-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.468 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:cf:e2 10.100.0.12'], port_security=['fa:16:3e:45:cf:e2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-718822f1-f31b-43f7-81ad-7c257e53efa2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '734be60e-cf29-48be-bfcc-a2e866fbc7f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da6afba3-4dcc-4b99-bb94-fa9ef8e0909a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=57979c3b-0d30-474f-b9e6-16ccca270fbf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.469 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 57979c3b-0d30-474f-b9e6-16ccca270fbf in datapath 718822f1-f31b-43f7-81ad-7c257e53efa2 unbound from our chassis#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.470 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.470 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 718822f1-f31b-43f7-81ad-7c257e53efa2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.473 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[432453c5-1af2-4bb9-9d67-1927323fb7b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.474 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2 namespace which is not needed anymore#033[00m
Jan 30 04:51:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:51:27Z|00511|binding|INFO|Releasing lport 4088bc52-1be0-4b2d-91f5-7a4615232b92 from this chassis (sb_readonly=0)
Jan 30 04:51:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:51:27Z|00512|binding|INFO|Setting lport 4088bc52-1be0-4b2d-91f5-7a4615232b92 down in Southbound
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.481 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:51:27Z|00513|binding|INFO|Removing iface tap4088bc52-1b ovn-installed in OVS
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.484 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.489 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.499 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:a9:16 2001:db8::f816:3eff:fe46:a916'], port_security=['fa:16:3e:46:a9:16 2001:db8::f816:3eff:fe46:a916'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe46:a916/64', 'neutron:device_id': '468f5a89-b848-45a6-8649-d09040ab2a09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '734be60e-cf29-48be-bfcc-a2e866fbc7f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=323b9e02-2128-4b84-8cbe-de6525d1728d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=4088bc52-1be0-4b2d-91f5-7a4615232b92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:51:27 np0005601977 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 30 04:51:27 np0005601977 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000002d.scope: Consumed 28.677s CPU time.
Jan 30 04:51:27 np0005601977 systemd-machined[154431]: Machine qemu-37-instance-0000002d terminated.
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [NOTICE]   (226356) : haproxy version is 2.8.14-c23fe91
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [NOTICE]   (226356) : path to executable is /usr/sbin/haproxy
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [WARNING]  (226356) : Exiting Master process...
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [ALERT]    (226356) : Current worker (226358) exited with code 143 (Terminated)
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2[226352]: [WARNING]  (226356) : All workers exited. Exiting... (0)
Jan 30 04:51:27 np0005601977 systemd[1]: libpod-844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60.scope: Deactivated successfully.
Jan 30 04:51:27 np0005601977 podman[228999]: 2026-01-30 09:51:27.626398374 +0000 UTC m=+0.070247332 container died 844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:51:27 np0005601977 NetworkManager[55565]: <info>  [1769766687.6428] manager: (tap4088bc52-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Jan 30 04:51:27 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60-userdata-shm.mount: Deactivated successfully.
Jan 30 04:51:27 np0005601977 systemd[1]: var-lib-containers-storage-overlay-9c04f847a71c873f5657082cc6afeb02521acfc38de06995fe48efc99b0bfac1-merged.mount: Deactivated successfully.
Jan 30 04:51:27 np0005601977 podman[228999]: 2026-01-30 09:51:27.661997744 +0000 UTC m=+0.105846702 container cleanup 844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:51:27 np0005601977 systemd[1]: libpod-conmon-844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60.scope: Deactivated successfully.
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.684 183134 INFO nova.virt.libvirt.driver [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Instance destroyed successfully.#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.685 183134 DEBUG nova.objects.instance [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid 468f5a89-b848-45a6-8649-d09040ab2a09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:51:27 np0005601977 podman[229050]: 2026-01-30 09:51:27.715945889 +0000 UTC m=+0.035451046 container remove 844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.719 183134 DEBUG nova.virt.libvirt.vif [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:44:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-320003657',display_name='tempest-TestGettingAddress-server-320003657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-320003657',id=45,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbpDVEriN/G87e23tjDUdYO6W+QEtQ+9v/1x/s2NFcnEBKJ6j0PSOvCUNixlnCRe42XdVWeKCW4XyJVokJAQfPGbHlRcdfispH8A6tY+5GgFZvp7MZgsLRLxCewdLADyQ==',key_name='tempest-TestGettingAddress-1604136189',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:44:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gto1nwdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:44:52Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=468f5a89-b848-45a6-8649-d09040ab2a09,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.719 183134 DEBUG nova.network.os_vif_util [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.720 183134 DEBUG nova.network.os_vif_util [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.721 183134 DEBUG os_vif [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.721 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7b9b7b-cd24-40aa-9334-cc44dc5fa13d]: (4, ('Fri Jan 30 09:51:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2 (844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60)\n844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60\nFri Jan 30 09:51:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2 (844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60)\n844d6b948c68d8f03e47e406f73a0d1f7aef40e870c21aada1346b2e045e9c60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.722 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.723 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57979c3b-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.723 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ac11e421-27f6-4135-9a5c-0f21f1369c91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.724 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap718822f1-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.724 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.727 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.732 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 kernel: tap718822f1-f0: left promiscuous mode
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.735 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.738 183134 INFO os_vif [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=57979c3b-0d30-474f-b9e6-16ccca270fbf,network=Network(718822f1-f31b-43f7-81ad-7c257e53efa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57979c3b-0d')#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.738 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0da5bd-f05e-4d9d-9be0-07db754454a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.739 183134 DEBUG nova.virt.libvirt.vif [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:44:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-320003657',display_name='tempest-TestGettingAddress-server-320003657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-320003657',id=45,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbpDVEriN/G87e23tjDUdYO6W+QEtQ+9v/1x/s2NFcnEBKJ6j0PSOvCUNixlnCRe42XdVWeKCW4XyJVokJAQfPGbHlRcdfispH8A6tY+5GgFZvp7MZgsLRLxCewdLADyQ==',key_name='tempest-TestGettingAddress-1604136189',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:44:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gto1nwdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:44:52Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=468f5a89-b848-45a6-8649-d09040ab2a09,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.739 183134 DEBUG nova.network.os_vif_util [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.740 183134 DEBUG nova.network.os_vif_util [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.741 183134 DEBUG os_vif [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.742 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.743 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4088bc52-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.746 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.749 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.750 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f30decf9-ba54-4992-8dda-20daeeb03743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.751 183134 INFO os_vif [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:a9:16,bridge_name='br-int',has_traffic_filtering=True,id=4088bc52-1be0-4b2d-91f5-7a4615232b92,network=Network(d8a742aa-08a4-4990-8e09-fbcff59d9bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4088bc52-1b')#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.751 183134 INFO nova.virt.libvirt.driver [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Deleting instance files /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09_del#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.752 183134 INFO nova.virt.libvirt.driver [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Deletion of /var/lib/nova/instances/468f5a89-b848-45a6-8649-d09040ab2a09_del complete#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.752 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf7a5cc-913f-44e8-8793-a80f3b7e84cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.767 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[25e81355-93b2-4dbe-9c25-cef9bec5c249]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482285, 'reachable_time': 42795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229071, 'error': None, 'target': 'ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.771 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-718822f1-f31b-43f7-81ad-7c257e53efa2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.771 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[6f069b31-b2d8-4efd-b5b6-6e93bfcd7c38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 systemd[1]: run-netns-ovnmeta\x2d718822f1\x2df31b\x2d43f7\x2d81ad\x2d7c257e53efa2.mount: Deactivated successfully.
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.772 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 4088bc52-1be0-4b2d-91f5-7a4615232b92 in datapath d8a742aa-08a4-4990-8e09-fbcff59d9bd9 unbound from our chassis#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.774 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8a742aa-08a4-4990-8e09-fbcff59d9bd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.775 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9ed68c-c81e-46bf-9cac-22011cfe79a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:27 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:27.775 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9 namespace which is not needed anymore#033[00m
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [NOTICE]   (226436) : haproxy version is 2.8.14-c23fe91
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [NOTICE]   (226436) : path to executable is /usr/sbin/haproxy
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [WARNING]  (226436) : Exiting Master process...
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [ALERT]    (226436) : Current worker (226438) exited with code 143 (Terminated)
Jan 30 04:51:27 np0005601977 neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9[226432]: [WARNING]  (226436) : All workers exited. Exiting... (0)
Jan 30 04:51:27 np0005601977 systemd[1]: libpod-f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29.scope: Deactivated successfully.
Jan 30 04:51:27 np0005601977 podman[229087]: 2026-01-30 09:51:27.897778936 +0000 UTC m=+0.047049448 container died f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:51:27 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29-userdata-shm.mount: Deactivated successfully.
Jan 30 04:51:27 np0005601977 systemd[1]: var-lib-containers-storage-overlay-eb8fd1413ef152a32d8424edf21c0f5217ec02708cd30b904b6bafde678d9f25-merged.mount: Deactivated successfully.
Jan 30 04:51:27 np0005601977 podman[229087]: 2026-01-30 09:51:27.932732347 +0000 UTC m=+0.082002899 container cleanup f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:51:27 np0005601977 systemd[1]: libpod-conmon-f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29.scope: Deactivated successfully.
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.989 183134 INFO nova.compute.manager [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.990 183134 DEBUG oslo.service.loopingcall [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.990 183134 DEBUG nova.compute.manager [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:51:27 np0005601977 nova_compute[183130]: 2026-01-30 09:51:27.991 183134 DEBUG nova.network.neutron [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:51:28 np0005601977 podman[229118]: 2026-01-30 09:51:28.00475202 +0000 UTC m=+0.053204155 container remove f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.008 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d849e590-252b-46fc-88ee-fc466c9ab92f]: (4, ('Fri Jan 30 09:51:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9 (f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29)\nf0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29\nFri Jan 30 09:51:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9 (f0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29)\nf0c0a0857d6fb7576bd2e4a652056781ab81fadbcd937f13341609c4e1a02b29\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.010 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2a45796c-db9e-4fd6-ba32-078833e6e438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.011 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a742aa-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:51:28 np0005601977 kernel: tapd8a742aa-00: left promiscuous mode
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.016 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.018 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[17d1d577-2650-41f4-b016-a298b1ff5cc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.019 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.032 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb88c1f-6788-46ec-92be-ffe87b10de7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.033 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc81183-dc57-4329-8f17-e1bd7ac1bec6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.044 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca0d7db-1d37-4db4-96d8-c7baa094327d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482358, 'reachable_time': 41806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229133, 'error': None, 'target': 'ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.046 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8a742aa-08a4-4990-8e09-fbcff59d9bd9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:51:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:28.046 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[3a60a4d9-df67-491f-bb56-1ea4bf74e715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.198 183134 DEBUG nova.compute.manager [req-43ecf737-3709-4522-98d2-fcdf6cc3dc15 req-2d65db5c-fdcf-4fe4-b878-6a353bba4a4c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-unplugged-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.199 183134 DEBUG oslo_concurrency.lockutils [req-43ecf737-3709-4522-98d2-fcdf6cc3dc15 req-2d65db5c-fdcf-4fe4-b878-6a353bba4a4c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.199 183134 DEBUG oslo_concurrency.lockutils [req-43ecf737-3709-4522-98d2-fcdf6cc3dc15 req-2d65db5c-fdcf-4fe4-b878-6a353bba4a4c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.199 183134 DEBUG oslo_concurrency.lockutils [req-43ecf737-3709-4522-98d2-fcdf6cc3dc15 req-2d65db5c-fdcf-4fe4-b878-6a353bba4a4c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.199 183134 DEBUG nova.compute.manager [req-43ecf737-3709-4522-98d2-fcdf6cc3dc15 req-2d65db5c-fdcf-4fe4-b878-6a353bba4a4c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] No waiting events found dispatching network-vif-unplugged-57979c3b-0d30-474f-b9e6-16ccca270fbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.200 183134 DEBUG nova.compute.manager [req-43ecf737-3709-4522-98d2-fcdf6cc3dc15 req-2d65db5c-fdcf-4fe4-b878-6a353bba4a4c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-unplugged-57979c3b-0d30-474f-b9e6-16ccca270fbf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.307 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.472 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.472 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:51:28 np0005601977 systemd[1]: run-netns-ovnmeta\x2dd8a742aa\x2d08a4\x2d4990\x2d8e09\x2dfbcff59d9bd9.mount: Deactivated successfully.
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.829 183134 DEBUG nova.network.neutron [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updated VIF entry in instance network info cache for port 57979c3b-0d30-474f-b9e6-16ccca270fbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.830 183134 DEBUG nova.network.neutron [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "address": "fa:16:3e:46:a9:16", "network": {"id": "d8a742aa-08a4-4990-8e09-fbcff59d9bd9", "bridge": "br-int", "label": "tempest-network-smoke--1807701852", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:a916", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4088bc52-1b", "ovs_interfaceid": "4088bc52-1be0-4b2d-91f5-7a4615232b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:51:28 np0005601977 nova_compute[183130]: 2026-01-30 09:51:28.919 183134 DEBUG oslo_concurrency.lockutils [req-cdb57269-9085-48ed-b14d-6856c2ea4a73 req-df87bf8b-2fff-41ca-9aa8-8b77a3f6253b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-468f5a89-b848-45a6-8649-d09040ab2a09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.286 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-unplugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.286 183134 DEBUG oslo_concurrency.lockutils [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.286 183134 DEBUG oslo_concurrency.lockutils [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.287 183134 DEBUG oslo_concurrency.lockutils [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.287 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] No waiting events found dispatching network-vif-unplugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.287 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-unplugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.287 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.288 183134 DEBUG oslo_concurrency.lockutils [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.288 183134 DEBUG oslo_concurrency.lockutils [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.288 183134 DEBUG oslo_concurrency.lockutils [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.288 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] No waiting events found dispatching network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.288 183134 WARNING nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received unexpected event network-vif-plugged-4088bc52-1be0-4b2d-91f5-7a4615232b92 for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.289 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-deleted-4088bc52-1be0-4b2d-91f5-7a4615232b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.289 183134 INFO nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Neutron deleted interface 4088bc52-1be0-4b2d-91f5-7a4615232b92; detaching it from the instance and deleting it from the info cache#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.289 183134 DEBUG nova.network.neutron [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [{"id": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "address": "fa:16:3e:45:cf:e2", "network": {"id": "718822f1-f31b-43f7-81ad-7c257e53efa2", "bridge": "br-int", "label": "tempest-network-smoke--926548934", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57979c3b-0d", "ovs_interfaceid": "57979c3b-0d30-474f-b9e6-16ccca270fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:51:29 np0005601977 nova_compute[183130]: 2026-01-30 09:51:29.398 183134 DEBUG nova.compute.manager [req-bdba0957-fed3-4896-83ff-be1db3b6072c req-4dffbdd6-71bd-4058-9c8f-e17bcfaaed83 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Detach interface failed, port_id=4088bc52-1be0-4b2d-91f5-7a4615232b92, reason: Instance 468f5a89-b848-45a6-8649-d09040ab2a09 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.137 183134 DEBUG nova.network.neutron [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.315 183134 DEBUG nova.compute.manager [req-ffc44788-1659-490d-aa53-a0f8564ac324 req-8f036e70-1837-4534-813c-b920e082ed93 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.315 183134 DEBUG oslo_concurrency.lockutils [req-ffc44788-1659-490d-aa53-a0f8564ac324 req-8f036e70-1837-4534-813c-b920e082ed93 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.316 183134 DEBUG oslo_concurrency.lockutils [req-ffc44788-1659-490d-aa53-a0f8564ac324 req-8f036e70-1837-4534-813c-b920e082ed93 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.316 183134 DEBUG oslo_concurrency.lockutils [req-ffc44788-1659-490d-aa53-a0f8564ac324 req-8f036e70-1837-4534-813c-b920e082ed93 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.316 183134 DEBUG nova.compute.manager [req-ffc44788-1659-490d-aa53-a0f8564ac324 req-8f036e70-1837-4534-813c-b920e082ed93 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] No waiting events found dispatching network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.317 183134 WARNING nova.compute.manager [req-ffc44788-1659-490d-aa53-a0f8564ac324 req-8f036e70-1837-4534-813c-b920e082ed93 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received unexpected event network-vif-plugged-57979c3b-0d30-474f-b9e6-16ccca270fbf for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.330 183134 INFO nova.compute.manager [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Took 2.34 seconds to deallocate network for instance.#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.458 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.459 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.469 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.516 183134 DEBUG nova.compute.provider_tree [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.740 183134 DEBUG nova.scheduler.client.report [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.769 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.801 183134 INFO nova.scheduler.client.report [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 468f5a89-b848-45a6-8649-d09040ab2a09#033[00m
Jan 30 04:51:30 np0005601977 nova_compute[183130]: 2026-01-30 09:51:30.887 183134 DEBUG oslo_concurrency.lockutils [None req-b1c6d89a-d101-4f19-989c-da74bd99ac19 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "468f5a89-b848-45a6-8649-d09040ab2a09" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:31 np0005601977 nova_compute[183130]: 2026-01-30 09:51:31.361 183134 DEBUG nova.compute.manager [req-869738de-a93a-4872-b140-2de686fdd2ba req-89fb5bf3-c35b-4fcf-88fe-255b6f542284 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Received event network-vif-deleted-57979c3b-0d30-474f-b9e6-16ccca270fbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:51:32 np0005601977 nova_compute[183130]: 2026-01-30 09:51:32.744 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:32 np0005601977 podman[229135]: 2026-01-30 09:51:32.855307863 +0000 UTC m=+0.077679026 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 30 04:51:33 np0005601977 nova_compute[183130]: 2026-01-30 09:51:33.310 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:33 np0005601977 nova_compute[183130]: 2026-01-30 09:51:33.480 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:37 np0005601977 nova_compute[183130]: 2026-01-30 09:51:37.748 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:38 np0005601977 nova_compute[183130]: 2026-01-30 09:51:38.345 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:40 np0005601977 nova_compute[183130]: 2026-01-30 09:51:40.913 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:40 np0005601977 nova_compute[183130]: 2026-01-30 09:51:40.935 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:41 np0005601977 podman[229164]: 2026-01-30 09:51:41.885264276 +0000 UTC m=+0.078618702 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:51:42 np0005601977 nova_compute[183130]: 2026-01-30 09:51:42.682 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766687.6818926, 468f5a89-b848-45a6-8649-d09040ab2a09 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:51:42 np0005601977 nova_compute[183130]: 2026-01-30 09:51:42.683 183134 INFO nova.compute.manager [-] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:51:42 np0005601977 nova_compute[183130]: 2026-01-30 09:51:42.751 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:42 np0005601977 nova_compute[183130]: 2026-01-30 09:51:42.787 183134 DEBUG nova.compute.manager [None req-d116c9bb-d9dc-43a1-9db0-c28c8365e64f - - - - - -] [instance: 468f5a89-b848-45a6-8649-d09040ab2a09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:51:43 np0005601977 nova_compute[183130]: 2026-01-30 09:51:43.347 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:47 np0005601977 nova_compute[183130]: 2026-01-30 09:51:47.754 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:48 np0005601977 nova_compute[183130]: 2026-01-30 09:51:48.350 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:51 np0005601977 podman[229189]: 2026-01-30 09:51:51.854659855 +0000 UTC m=+0.068267266 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 30 04:51:51 np0005601977 podman[229190]: 2026-01-30 09:51:51.87507278 +0000 UTC m=+0.081572957 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:51:52 np0005601977 nova_compute[183130]: 2026-01-30 09:51:52.766 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:53 np0005601977 nova_compute[183130]: 2026-01-30 09:51:53.352 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:51:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:55.797 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:cf:f3 10.100.0.2 2001:db8::f816:3eff:fe0f:cff3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe0f:cff3/64', 'neutron:device_id': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ed9f11f-a0b3-4864-9831-309b1f69376a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c7609c2f-a62c-4f5c-ae1f-79d31dc0530f) old=Port_Binding(mac=['fa:16:3e:0f:cf:f3 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:51:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:55.798 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c7609c2f-a62c-4f5c-ae1f-79d31dc0530f in datapath 0974bb4a-b27d-43c9-b594-a23be3309557 updated#033[00m
Jan 30 04:51:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:55.799 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0974bb4a-b27d-43c9-b594-a23be3309557, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:51:55 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:55.799 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[49a36e68-8611-4d24-9086-e42fd4c865aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:51:56 np0005601977 podman[229231]: 2026-01-30 09:51:56.839083104 +0000 UTC m=+0.056039676 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:51:56 np0005601977 podman[229230]: 2026-01-30 09:51:56.847638599 +0000 UTC m=+0.062759989 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:51:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:57.406 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:57.406 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:51:57.407 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:57 np0005601977 nova_compute[183130]: 2026-01-30 09:51:57.769 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:51:58 np0005601977 nova_compute[183130]: 2026-01-30 09:51:58.353 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:02 np0005601977 nova_compute[183130]: 2026-01-30 09:52:02.771 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:03 np0005601977 nova_compute[183130]: 2026-01-30 09:52:03.355 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:03 np0005601977 podman[229273]: 2026-01-30 09:52:03.870155964 +0000 UTC m=+0.089188225 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:52:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:04.938 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:cf:f3 10.100.0.2 2001:db8:0:1:f816:3eff:fe0f:cff3 2001:db8::f816:3eff:fe0f:cff3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe0f:cff3/64 2001:db8::f816:3eff:fe0f:cff3/64', 'neutron:device_id': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ed9f11f-a0b3-4864-9831-309b1f69376a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c7609c2f-a62c-4f5c-ae1f-79d31dc0530f) old=Port_Binding(mac=['fa:16:3e:0f:cf:f3 10.100.0.2 2001:db8::f816:3eff:fe0f:cff3'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe0f:cff3/64', 'neutron:device_id': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:52:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:04.940 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c7609c2f-a62c-4f5c-ae1f-79d31dc0530f in datapath 0974bb4a-b27d-43c9-b594-a23be3309557 updated#033[00m
Jan 30 04:52:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:04.941 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0974bb4a-b27d-43c9-b594-a23be3309557, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:52:04 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:04.942 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[62042403-bb05-4bc1-a084-c735994aa948]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:07 np0005601977 nova_compute[183130]: 2026-01-30 09:52:07.828 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:08 np0005601977 nova_compute[183130]: 2026-01-30 09:52:08.358 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.470 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.471 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.484 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.567 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.567 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.574 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.574 183134 INFO nova.compute.claims [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.681 183134 DEBUG nova.compute.provider_tree [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.696 183134 DEBUG nova.scheduler.client.report [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.726 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.728 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.789 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.790 183134 DEBUG nova.network.neutron [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.812 183134 INFO nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.829 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.833 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:52:12 np0005601977 podman[229299]: 2026-01-30 09:52:12.873076833 +0000 UTC m=+0.083835952 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.936 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.937 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.938 183134 INFO nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Creating image(s)#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.938 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.938 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.939 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:12 np0005601977 nova_compute[183130]: 2026-01-30 09:52:12.952 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.031 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.032 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.033 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.047 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.120 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.121 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.150 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.151 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.151 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.226 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.227 183134 DEBUG nova.virt.disk.api [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.228 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.281 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.283 183134 DEBUG nova.virt.disk.api [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.283 183134 DEBUG nova.objects.instance [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.328 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.328 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Ensure instance console log exists: /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.329 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.329 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.329 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.361 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:13 np0005601977 nova_compute[183130]: 2026-01-30 09:52:13.649 183134 DEBUG nova.policy [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:52:15 np0005601977 nova_compute[183130]: 2026-01-30 09:52:15.756 183134 DEBUG nova.network.neutron [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Successfully created port: 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.580 183134 DEBUG nova.network.neutron [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Successfully updated port: 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.598 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.598 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.599 183134 DEBUG nova.network.neutron [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.697 183134 DEBUG nova.compute.manager [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-changed-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.698 183134 DEBUG nova.compute.manager [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Refreshing instance network info cache due to event network-changed-7c0dcbc0-e8be-4d24-9ace-95ca83739c94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.698 183134 DEBUG oslo_concurrency.lockutils [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:52:16 np0005601977 nova_compute[183130]: 2026-01-30 09:52:16.753 183134 DEBUG nova.network.neutron [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.364 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.365 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.365 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.365 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.568 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.570 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.24826431274414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.570 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.571 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.642 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.644 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.644 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.692 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.711 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.740 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.741 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:17 np0005601977 nova_compute[183130]: 2026-01-30 09:52:17.832 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.216 183134 DEBUG nova.network.neutron [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updating instance_info_cache with network_info: [{"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.237 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.237 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance network_info: |[{"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.238 183134 DEBUG oslo_concurrency.lockutils [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.239 183134 DEBUG nova.network.neutron [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Refreshing network info cache for port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.244 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Start _get_guest_xml network_info=[{"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.249 183134 WARNING nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.254 183134 DEBUG nova.virt.libvirt.host [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.255 183134 DEBUG nova.virt.libvirt.host [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.259 183134 DEBUG nova.virt.libvirt.host [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.260 183134 DEBUG nova.virt.libvirt.host [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.261 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.262 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.263 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.263 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.264 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.264 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.265 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.265 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.266 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.266 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.266 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.267 183134 DEBUG nova.virt.hardware [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.273 183134 DEBUG nova.virt.libvirt.vif [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:52:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1649993017',display_name='tempest-TestGettingAddress-server-1649993017',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1649993017',id=52,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK00D8vecAOmC+75D7EixtAKdLQu9WukSVS+bFDzxBD4lZ5qARxqvenS++neSnkrPQmUB2Bbt5//1XqfaHhAnE3u5mJ7J9hfok68eZ6IhkUY/HmJr7e+w7zzvloOYyUpWg==',key_name='tempest-TestGettingAddress-283787783',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gpyx1avo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:52:12Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.274 183134 DEBUG nova.network.os_vif_util [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.275 183134 DEBUG nova.network.os_vif_util [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.276 183134 DEBUG nova.objects.instance [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.310 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <uuid>b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4</uuid>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <name>instance-00000034</name>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-1649993017</nova:name>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:52:18</nova:creationTime>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        <nova:port uuid="7c0dcbc0-e8be-4d24-9ace-95ca83739c94">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe05:6a47" ipVersion="6"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe05:6a47" ipVersion="6"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <entry name="serial">b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4</entry>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <entry name="uuid">b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4</entry>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.config"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:05:6a:47"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <target dev="tap7c0dcbc0-e8"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/console.log" append="off"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:52:18 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:52:18 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:52:18 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:52:18 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.311 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Preparing to wait for external event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.311 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.311 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.312 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.312 183134 DEBUG nova.virt.libvirt.vif [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:52:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1649993017',display_name='tempest-TestGettingAddress-server-1649993017',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1649993017',id=52,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK00D8vecAOmC+75D7EixtAKdLQu9WukSVS+bFDzxBD4lZ5qARxqvenS++neSnkrPQmUB2Bbt5//1XqfaHhAnE3u5mJ7J9hfok68eZ6IhkUY/HmJr7e+w7zzvloOYyUpWg==',key_name='tempest-TestGettingAddress-283787783',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gpyx1avo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:52:12Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.313 183134 DEBUG nova.network.os_vif_util [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.314 183134 DEBUG nova.network.os_vif_util [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.314 183134 DEBUG os_vif [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.315 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.315 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.316 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.320 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.321 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c0dcbc0-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.322 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c0dcbc0-e8, col_values=(('external_ids', {'iface-id': '7c0dcbc0-e8be-4d24-9ace-95ca83739c94', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:6a:47', 'vm-uuid': 'b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.324 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:18 np0005601977 NetworkManager[55565]: <info>  [1769766738.3255] manager: (tap7c0dcbc0-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.327 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.332 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.333 183134 INFO os_vif [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8')#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.363 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.389 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.390 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.390 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:05:6a:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.391 183134 INFO nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Using config drive#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.857 183134 INFO nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Creating config drive at /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.config#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.863 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr2335tel execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:18 np0005601977 nova_compute[183130]: 2026-01-30 09:52:18.988 183134 DEBUG oslo_concurrency.processutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr2335tel" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:19 np0005601977 kernel: tap7c0dcbc0-e8: entered promiscuous mode
Jan 30 04:52:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:19Z|00514|binding|INFO|Claiming lport 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 for this chassis.
Jan 30 04:52:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:19Z|00515|binding|INFO|7c0dcbc0-e8be-4d24-9ace-95ca83739c94: Claiming fa:16:3e:05:6a:47 10.100.0.9 2001:db8:0:1:f816:3eff:fe05:6a47 2001:db8::f816:3eff:fe05:6a47
Jan 30 04:52:19 np0005601977 NetworkManager[55565]: <info>  [1769766739.0531] manager: (tap7c0dcbc0-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.053 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.056 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.059 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.061 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.072 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:6a:47 10.100.0.9 2001:db8:0:1:f816:3eff:fe05:6a47 2001:db8::f816:3eff:fe05:6a47'], port_security=['fa:16:3e:05:6a:47 10.100.0.9 2001:db8:0:1:f816:3eff:fe05:6a47 2001:db8::f816:3eff:fe05:6a47'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe05:6a47/64 2001:db8::f816:3eff:fe05:6a47/64', 'neutron:device_id': 'b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '720a7e6b-119b-41e0-ac76-ea253cb891fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ed9f11f-a0b3-4864-9831-309b1f69376a, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=7c0dcbc0-e8be-4d24-9ace-95ca83739c94) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.074 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 in datapath 0974bb4a-b27d-43c9-b594-a23be3309557 bound to our chassis#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.077 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0974bb4a-b27d-43c9-b594-a23be3309557#033[00m
Jan 30 04:52:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:19Z|00516|binding|INFO|Setting lport 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 ovn-installed in OVS
Jan 30 04:52:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:19Z|00517|binding|INFO|Setting lport 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 up in Southbound
Jan 30 04:52:19 np0005601977 systemd-machined[154431]: New machine qemu-41-instance-00000034.
Jan 30 04:52:19 np0005601977 systemd-udevd[229360]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.084 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.090 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[68fb3a32-2c6e-4e05-ac0f-a95cb1a07e6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.091 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0974bb4a-b1 in ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:52:19 np0005601977 NetworkManager[55565]: <info>  [1769766739.0939] device (tap7c0dcbc0-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:52:19 np0005601977 NetworkManager[55565]: <info>  [1769766739.0948] device (tap7c0dcbc0-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.094 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0974bb4a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.094 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[9815a25d-cd2e-419a-b030-43e8bf551b07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.095 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[01cc5c63-17fe-49f7-8e40-92c63f137360]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 systemd[1]: Started Virtual Machine qemu-41-instance-00000034.
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.105 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[af09fb1b-9d45-4003-a0ce-dc805b220197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.116 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ada3f437-0c0d-4b46-8b02-c312f08b0aba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.138 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e4da3298-71b5-490d-ae0e-853e3b816ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 NetworkManager[55565]: <info>  [1769766739.1448] manager: (tap0974bb4a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/214)
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.144 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ba09b8-66f0-4532-87cc-2dc863b3ebc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 systemd-udevd[229363]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.170 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b29b7a23-3c66-41e4-9653-6949ff1ba4b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.174 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f7bf27-af64-4e2a-a930-518a586582f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 NetworkManager[55565]: <info>  [1769766739.1922] device (tap0974bb4a-b0): carrier: link connected
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.195 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[be3c405b-8a71-47a4-921a-8975e547513b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.213 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[25e80a0d-326a-4215-a537-fdc6edeaec0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0974bb4a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:cf:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527053, 'reachable_time': 24030, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229393, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.227 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d9cc5ec0-38cd-4b18-b13d-2ab78b6804e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:cff3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527053, 'tstamp': 527053}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229394, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.243 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[04c9cc01-a29b-487e-b9e7-65c7d3b74646]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0974bb4a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:cf:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527053, 'reachable_time': 24030, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229395, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.265 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[904de2b7-0bdd-46b5-ae7e-f823449c8175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.322 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[008d14e7-264e-4f71-88d1-3b839378663b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.324 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0974bb4a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.324 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.324 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0974bb4a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.326 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 NetworkManager[55565]: <info>  [1769766739.3270] manager: (tap0974bb4a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 30 04:52:19 np0005601977 kernel: tap0974bb4a-b0: entered promiscuous mode
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.330 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0974bb4a-b0, col_values=(('external_ids', {'iface-id': 'c7609c2f-a62c-4f5c-ae1f-79d31dc0530f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:19 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:19Z|00518|binding|INFO|Releasing lport c7609c2f-a62c-4f5c-ae1f-79d31dc0530f from this chassis (sb_readonly=0)
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.332 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.335 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0974bb4a-b27d-43c9-b594-a23be3309557.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0974bb4a-b27d-43c9-b594-a23be3309557.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.336 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c1bf9df2-65a8-4adf-b11e-6c93529e7372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.337 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-0974bb4a-b27d-43c9-b594-a23be3309557
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/0974bb4a-b27d-43c9-b594-a23be3309557.pid.haproxy
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 0974bb4a-b27d-43c9-b594-a23be3309557
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.337 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:19 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:19.338 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'env', 'PROCESS_TAG=haproxy-0974bb4a-b27d-43c9-b594-a23be3309557', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0974bb4a-b27d-43c9-b594-a23be3309557.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.382 183134 DEBUG nova.compute.manager [req-8aaecd9b-faef-478f-af85-8cc9869bb051 req-81b3b09a-fc52-4e12-bc0e-f4b2422ca365 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.383 183134 DEBUG oslo_concurrency.lockutils [req-8aaecd9b-faef-478f-af85-8cc9869bb051 req-81b3b09a-fc52-4e12-bc0e-f4b2422ca365 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.383 183134 DEBUG oslo_concurrency.lockutils [req-8aaecd9b-faef-478f-af85-8cc9869bb051 req-81b3b09a-fc52-4e12-bc0e-f4b2422ca365 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.383 183134 DEBUG oslo_concurrency.lockutils [req-8aaecd9b-faef-478f-af85-8cc9869bb051 req-81b3b09a-fc52-4e12-bc0e-f4b2422ca365 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:19 np0005601977 nova_compute[183130]: 2026-01-30 09:52:19.384 183134 DEBUG nova.compute.manager [req-8aaecd9b-faef-478f-af85-8cc9869bb051 req-81b3b09a-fc52-4e12-bc0e-f4b2422ca365 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Processing event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:52:19 np0005601977 podman[229427]: 2026-01-30 09:52:19.681432017 +0000 UTC m=+0.057082806 container create 1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:52:19 np0005601977 systemd[1]: Started libpod-conmon-1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7.scope.
Jan 30 04:52:19 np0005601977 podman[229427]: 2026-01-30 09:52:19.651305834 +0000 UTC m=+0.026956713 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:52:19 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:52:19 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2339da5761650c8d6a23365ccbc290a72cd73a127597099081e0b65a2753daa4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:52:19 np0005601977 podman[229427]: 2026-01-30 09:52:19.769583482 +0000 UTC m=+0.145234371 container init 1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 30 04:52:19 np0005601977 podman[229427]: 2026-01-30 09:52:19.776616423 +0000 UTC m=+0.152267242 container start 1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:52:19 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [NOTICE]   (229447) : New worker (229449) forked
Jan 30 04:52:19 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [NOTICE]   (229447) : Loading success.
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.285 183134 DEBUG nova.network.neutron [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updated VIF entry in instance network info cache for port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.286 183134 DEBUG nova.network.neutron [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updating instance_info_cache with network_info: [{"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.310 183134 DEBUG oslo_concurrency.lockutils [req-83da4c25-4d27-4970-b613-df5d93e6c47b req-a09cd34e-91bd-4716-bcfd-c042abe5d8c8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.444 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766740.4435709, b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.445 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] VM Started (Lifecycle Event)#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.447 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.452 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.457 183134 INFO nova.virt.libvirt.driver [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance spawned successfully.#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.458 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.477 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.485 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.488 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.489 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.489 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.490 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.490 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.491 183134 DEBUG nova.virt.libvirt.driver [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.523 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.524 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766740.4448307, b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.524 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.556 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.563 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766740.4503381, b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.564 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.597 183134 INFO nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Took 7.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.597 183134 DEBUG nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.599 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.606 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.664 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.699 183134 INFO nova.compute.manager [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Took 8.16 seconds to build instance.#033[00m
Jan 30 04:52:20 np0005601977 nova_compute[183130]: 2026-01-30 09:52:20.716 183134 DEBUG oslo_concurrency.lockutils [None req-80e09f2e-7e05-474c-8623-a10b1313f839 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:21 np0005601977 nova_compute[183130]: 2026-01-30 09:52:21.471 183134 DEBUG nova.compute.manager [req-c228e4d5-3be4-4dfe-b4af-4b7583571543 req-2dffb696-6519-48ff-bcaa-2d692bb8c231 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:21 np0005601977 nova_compute[183130]: 2026-01-30 09:52:21.471 183134 DEBUG oslo_concurrency.lockutils [req-c228e4d5-3be4-4dfe-b4af-4b7583571543 req-2dffb696-6519-48ff-bcaa-2d692bb8c231 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:21 np0005601977 nova_compute[183130]: 2026-01-30 09:52:21.472 183134 DEBUG oslo_concurrency.lockutils [req-c228e4d5-3be4-4dfe-b4af-4b7583571543 req-2dffb696-6519-48ff-bcaa-2d692bb8c231 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:21 np0005601977 nova_compute[183130]: 2026-01-30 09:52:21.472 183134 DEBUG oslo_concurrency.lockutils [req-c228e4d5-3be4-4dfe-b4af-4b7583571543 req-2dffb696-6519-48ff-bcaa-2d692bb8c231 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:21 np0005601977 nova_compute[183130]: 2026-01-30 09:52:21.473 183134 DEBUG nova.compute.manager [req-c228e4d5-3be4-4dfe-b4af-4b7583571543 req-2dffb696-6519-48ff-bcaa-2d692bb8c231 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] No waiting events found dispatching network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:52:21 np0005601977 nova_compute[183130]: 2026-01-30 09:52:21.473 183134 WARNING nova.compute.manager [req-c228e4d5-3be4-4dfe-b4af-4b7583571543 req-2dffb696-6519-48ff-bcaa-2d692bb8c231 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received unexpected event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:52:22 np0005601977 podman[229466]: 2026-01-30 09:52:22.846845011 +0000 UTC m=+0.064485658 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:52:22 np0005601977 podman[229465]: 2026-01-30 09:52:22.855263352 +0000 UTC m=+0.074067022 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.7, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Jan 30 04:52:23 np0005601977 nova_compute[183130]: 2026-01-30 09:52:23.324 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:23 np0005601977 nova_compute[183130]: 2026-01-30 09:52:23.363 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:23 np0005601977 nova_compute[183130]: 2026-01-30 09:52:23.741 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:25 np0005601977 nova_compute[183130]: 2026-01-30 09:52:25.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:26 np0005601977 nova_compute[183130]: 2026-01-30 09:52:26.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:26 np0005601977 nova_compute[183130]: 2026-01-30 09:52:26.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:26 np0005601977 nova_compute[183130]: 2026-01-30 09:52:26.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:52:26 np0005601977 nova_compute[183130]: 2026-01-30 09:52:26.844 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:26 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:26.848 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:52:26 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:26.853 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:52:27 np0005601977 nova_compute[183130]: 2026-01-30 09:52:27.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:27 np0005601977 nova_compute[183130]: 2026-01-30 09:52:27.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:27 np0005601977 NetworkManager[55565]: <info>  [1769766747.8363] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Jan 30 04:52:27 np0005601977 nova_compute[183130]: 2026-01-30 09:52:27.835 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:27 np0005601977 NetworkManager[55565]: <info>  [1769766747.8380] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 30 04:52:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:27Z|00519|binding|INFO|Releasing lport c7609c2f-a62c-4f5c-ae1f-79d31dc0530f from this chassis (sb_readonly=0)
Jan 30 04:52:27 np0005601977 nova_compute[183130]: 2026-01-30 09:52:27.853 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:27 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:27Z|00520|binding|INFO|Releasing lport c7609c2f-a62c-4f5c-ae1f-79d31dc0530f from this chassis (sb_readonly=0)
Jan 30 04:52:27 np0005601977 nova_compute[183130]: 2026-01-30 09:52:27.859 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:27 np0005601977 podman[229507]: 2026-01-30 09:52:27.86104997 +0000 UTC m=+0.078093907 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:52:27 np0005601977 podman[229508]: 2026-01-30 09:52:27.863316555 +0000 UTC m=+0.072394124 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.077 183134 DEBUG nova.compute.manager [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-changed-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.077 183134 DEBUG nova.compute.manager [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Refreshing instance network info cache due to event network-changed-7c0dcbc0-e8be-4d24-9ace-95ca83739c94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.077 183134 DEBUG oslo_concurrency.lockutils [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.078 183134 DEBUG oslo_concurrency.lockutils [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.078 183134 DEBUG nova.network.neutron [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Refreshing network info cache for port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.327 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.361 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:28 np0005601977 nova_compute[183130]: 2026-01-30 09:52:28.364 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:29.860 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:30 np0005601977 nova_compute[183130]: 2026-01-30 09:52:30.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:30 np0005601977 nova_compute[183130]: 2026-01-30 09:52:30.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:52:30 np0005601977 nova_compute[183130]: 2026-01-30 09:52:30.345 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:52:30 np0005601977 nova_compute[183130]: 2026-01-30 09:52:30.662 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:52:31 np0005601977 nova_compute[183130]: 2026-01-30 09:52:31.683 183134 DEBUG nova.network.neutron [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updated VIF entry in instance network info cache for port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:52:31 np0005601977 nova_compute[183130]: 2026-01-30 09:52:31.684 183134 DEBUG nova.network.neutron [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updating instance_info_cache with network_info: [{"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:52:31 np0005601977 nova_compute[183130]: 2026-01-30 09:52:31.700 183134 DEBUG oslo_concurrency.lockutils [req-99030a94-0e79-41b2-984c-769ba2261359 req-5e428bf3-3e5d-4ebf-991a-6acba88f77c1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:52:31 np0005601977 nova_compute[183130]: 2026-01-30 09:52:31.700 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:52:31 np0005601977 nova_compute[183130]: 2026-01-30 09:52:31.700 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:52:31 np0005601977 nova_compute[183130]: 2026-01-30 09:52:31.701 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:52:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:33Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:6a:47 10.100.0.9
Jan 30 04:52:33 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:33Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:6a:47 10.100.0.9
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.289 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updating instance_info_cache with network_info: [{"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.303 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.304 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.330 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:52:33 np0005601977 nova_compute[183130]: 2026-01-30 09:52:33.367 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:34 np0005601977 podman[229568]: 2026-01-30 09:52:34.885133041 +0000 UTC m=+0.106546493 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:52:38 np0005601977 nova_compute[183130]: 2026-01-30 09:52:38.332 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:38 np0005601977 nova_compute[183130]: 2026-01-30 09:52:38.370 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:43 np0005601977 nova_compute[183130]: 2026-01-30 09:52:43.335 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:43 np0005601977 nova_compute[183130]: 2026-01-30 09:52:43.372 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:43 np0005601977 podman[229595]: 2026-01-30 09:52:43.83279594 +0000 UTC m=+0.054791790 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:52:45 np0005601977 nova_compute[183130]: 2026-01-30 09:52:45.358 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:45 np0005601977 nova_compute[183130]: 2026-01-30 09:52:45.358 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:52:45 np0005601977 nova_compute[183130]: 2026-01-30 09:52:45.374 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.082 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.082 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.097 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.171 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.171 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.179 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.179 183134 INFO nova.compute.claims [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.283 183134 DEBUG nova.compute.provider_tree [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.296 183134 DEBUG nova.scheduler.client.report [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.318 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.319 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.364 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.364 183134 DEBUG nova.network.neutron [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.382 183134 INFO nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.403 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.510 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.512 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.513 183134 INFO nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Creating image(s)#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.514 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.515 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.516 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.541 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.560 183134 DEBUG nova.policy [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.604 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.605 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.606 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.622 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.666 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.667 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.689 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.691 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.691 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.753 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.754 183134 DEBUG nova.virt.disk.api [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.754 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.833 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.835 183134 DEBUG nova.virt.disk.api [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.836 183134 DEBUG nova.objects.instance [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid f8bba2b8-618b-4c55-93d9-4de905bc3554 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.961 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.962 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Ensure instance console log exists: /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.963 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.963 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:46 np0005601977 nova_compute[183130]: 2026-01-30 09:52:46.964 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:48 np0005601977 nova_compute[183130]: 2026-01-30 09:52:48.373 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:52:48 np0005601977 nova_compute[183130]: 2026-01-30 09:52:48.375 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:52:48 np0005601977 nova_compute[183130]: 2026-01-30 09:52:48.376 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 30 04:52:48 np0005601977 nova_compute[183130]: 2026-01-30 09:52:48.376 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:52:48 np0005601977 nova_compute[183130]: 2026-01-30 09:52:48.388 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:48 np0005601977 nova_compute[183130]: 2026-01-30 09:52:48.388 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 30 04:52:49 np0005601977 nova_compute[183130]: 2026-01-30 09:52:49.102 183134 DEBUG nova.network.neutron [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Successfully created port: 5db637a6-7a24-49dc-a9c6-7a733c38f45a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.685 183134 DEBUG nova.network.neutron [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Successfully updated port: 5db637a6-7a24-49dc-a9c6-7a733c38f45a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.703 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.704 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.704 183134 DEBUG nova.network.neutron [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.795 183134 DEBUG nova.compute.manager [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-changed-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.796 183134 DEBUG nova.compute.manager [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Refreshing instance network info cache due to event network-changed-5db637a6-7a24-49dc-a9c6-7a733c38f45a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:52:50 np0005601977 nova_compute[183130]: 2026-01-30 09:52:50.796 183134 DEBUG oslo_concurrency.lockutils [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:52:51 np0005601977 nova_compute[183130]: 2026-01-30 09:52:51.681 183134 DEBUG nova.network.neutron [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:52:53 np0005601977 nova_compute[183130]: 2026-01-30 09:52:53.389 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:53 np0005601977 nova_compute[183130]: 2026-01-30 09:52:53.391 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:52:53 np0005601977 podman[229637]: 2026-01-30 09:52:53.868645083 +0000 UTC m=+0.075027850 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 30 04:52:53 np0005601977 podman[229636]: 2026-01-30 09:52:53.870126485 +0000 UTC m=+0.080578219 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:52:54 np0005601977 nova_compute[183130]: 2026-01-30 09:52:54.968 183134 DEBUG nova.network.neutron [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updating instance_info_cache with network_info: [{"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:52:54 np0005601977 nova_compute[183130]: 2026-01-30 09:52:54.993 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:52:54 np0005601977 nova_compute[183130]: 2026-01-30 09:52:54.993 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Instance network_info: |[{"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:52:54 np0005601977 nova_compute[183130]: 2026-01-30 09:52:54.994 183134 DEBUG oslo_concurrency.lockutils [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:52:54 np0005601977 nova_compute[183130]: 2026-01-30 09:52:54.994 183134 DEBUG nova.network.neutron [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Refreshing network info cache for port 5db637a6-7a24-49dc-a9c6-7a733c38f45a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.000 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Start _get_guest_xml network_info=[{"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.005 183134 WARNING nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.019 183134 DEBUG nova.virt.libvirt.host [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.021 183134 DEBUG nova.virt.libvirt.host [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.026 183134 DEBUG nova.virt.libvirt.host [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.027 183134 DEBUG nova.virt.libvirt.host [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.028 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.028 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.029 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.029 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.029 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.030 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.030 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.030 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.030 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.031 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.031 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.031 183134 DEBUG nova.virt.hardware [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.035 183134 DEBUG nova.virt.libvirt.vif [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:52:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-33413424',display_name='tempest-TestGettingAddress-server-33413424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-33413424',id=53,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK00D8vecAOmC+75D7EixtAKdLQu9WukSVS+bFDzxBD4lZ5qARxqvenS++neSnkrPQmUB2Bbt5//1XqfaHhAnE3u5mJ7J9hfok68eZ6IhkUY/HmJr7e+w7zzvloOYyUpWg==',key_name='tempest-TestGettingAddress-283787783',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gupr8re3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:52:46Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=f8bba2b8-618b-4c55-93d9-4de905bc3554,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.036 183134 DEBUG nova.network.os_vif_util [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.037 183134 DEBUG nova.network.os_vif_util [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.038 183134 DEBUG nova.objects.instance [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid f8bba2b8-618b-4c55-93d9-4de905bc3554 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.057 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <uuid>f8bba2b8-618b-4c55-93d9-4de905bc3554</uuid>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <name>instance-00000035</name>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-33413424</nova:name>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:52:55</nova:creationTime>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        <nova:port uuid="5db637a6-7a24-49dc-a9c6-7a733c38f45a">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:feb6:3281" ipVersion="6"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:feb6:3281" ipVersion="6"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <entry name="serial">f8bba2b8-618b-4c55-93d9-4de905bc3554</entry>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <entry name="uuid">f8bba2b8-618b-4c55-93d9-4de905bc3554</entry>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.config"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:b6:32:81"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <target dev="tap5db637a6-7a"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/console.log" append="off"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:52:55 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:52:55 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:52:55 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:52:55 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.059 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Preparing to wait for external event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.059 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.059 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.060 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.061 183134 DEBUG nova.virt.libvirt.vif [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:52:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-33413424',display_name='tempest-TestGettingAddress-server-33413424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-33413424',id=53,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK00D8vecAOmC+75D7EixtAKdLQu9WukSVS+bFDzxBD4lZ5qARxqvenS++neSnkrPQmUB2Bbt5//1XqfaHhAnE3u5mJ7J9hfok68eZ6IhkUY/HmJr7e+w7zzvloOYyUpWg==',key_name='tempest-TestGettingAddress-283787783',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gupr8re3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:52:46Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=f8bba2b8-618b-4c55-93d9-4de905bc3554,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.061 183134 DEBUG nova.network.os_vif_util [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.063 183134 DEBUG nova.network.os_vif_util [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.063 183134 DEBUG os_vif [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.064 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.065 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.065 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.069 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.069 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5db637a6-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.070 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5db637a6-7a, col_values=(('external_ids', {'iface-id': '5db637a6-7a24-49dc-a9c6-7a733c38f45a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:32:81', 'vm-uuid': 'f8bba2b8-618b-4c55-93d9-4de905bc3554'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.072 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:55 np0005601977 NetworkManager[55565]: <info>  [1769766775.0739] manager: (tap5db637a6-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.075 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.080 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.082 183134 INFO os_vif [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a')#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.140 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.141 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.141 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:b6:32:81, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.142 183134 INFO nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Using config drive#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.814 183134 INFO nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Creating config drive at /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.config#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.818 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp250ejotc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.936 183134 DEBUG oslo_concurrency.processutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp250ejotc" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:52:55 np0005601977 kernel: tap5db637a6-7a: entered promiscuous mode
Jan 30 04:52:55 np0005601977 NetworkManager[55565]: <info>  [1769766775.9956] manager: (tap5db637a6-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Jan 30 04:52:55 np0005601977 nova_compute[183130]: 2026-01-30 09:52:55.995 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:55Z|00521|binding|INFO|Claiming lport 5db637a6-7a24-49dc-a9c6-7a733c38f45a for this chassis.
Jan 30 04:52:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:55Z|00522|binding|INFO|5db637a6-7a24-49dc-a9c6-7a733c38f45a: Claiming fa:16:3e:b6:32:81 10.100.0.10 2001:db8:0:1:f816:3eff:feb6:3281 2001:db8::f816:3eff:feb6:3281
Jan 30 04:52:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:56Z|00523|binding|INFO|Setting lport 5db637a6-7a24-49dc-a9c6-7a733c38f45a ovn-installed in OVS
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.003 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:56 np0005601977 ovn_controller[95460]: 2026-01-30T09:52:56Z|00524|binding|INFO|Setting lport 5db637a6-7a24-49dc-a9c6-7a733c38f45a up in Southbound
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.006 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.007 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:32:81 10.100.0.10 2001:db8:0:1:f816:3eff:feb6:3281 2001:db8::f816:3eff:feb6:3281'], port_security=['fa:16:3e:b6:32:81 10.100.0.10 2001:db8:0:1:f816:3eff:feb6:3281 2001:db8::f816:3eff:feb6:3281'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8:0:1:f816:3eff:feb6:3281/64 2001:db8::f816:3eff:feb6:3281/64', 'neutron:device_id': 'f8bba2b8-618b-4c55-93d9-4de905bc3554', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '720a7e6b-119b-41e0-ac76-ea253cb891fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ed9f11f-a0b3-4864-9831-309b1f69376a, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=5db637a6-7a24-49dc-a9c6-7a733c38f45a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.011 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 5db637a6-7a24-49dc-a9c6-7a733c38f45a in datapath 0974bb4a-b27d-43c9-b594-a23be3309557 bound to our chassis#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.013 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0974bb4a-b27d-43c9-b594-a23be3309557#033[00m
Jan 30 04:52:56 np0005601977 systemd-machined[154431]: New machine qemu-42-instance-00000035.
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.027 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2d734153-6f6c-4db2-904d-08edae68f116]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:56 np0005601977 systemd[1]: Started Virtual Machine qemu-42-instance-00000035.
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.051 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5e49e7-5fdd-4f50-a3f6-bf34e52e474b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.055 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[35e3fca2-7d4e-4c4b-8c51-d7122a1d7d18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:56 np0005601977 systemd-udevd[229704]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:52:56 np0005601977 NetworkManager[55565]: <info>  [1769766776.0763] device (tap5db637a6-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:52:56 np0005601977 NetworkManager[55565]: <info>  [1769766776.0769] device (tap5db637a6-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.077 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[45526f7d-bb62-4c9c-b060-3422707b102d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.092 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[6eee8eee-6e67-4520-8092-aeae6323981f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0974bb4a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:cf:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 24, 'tx_packets': 5, 'rx_bytes': 2000, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 24, 'tx_packets': 5, 'rx_bytes': 2000, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527053, 'reachable_time': 22154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 22, 'inoctets': 1608, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 22, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1608, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 22, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229711, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.102 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[482b2c5e-ae3b-43ad-8e2c-1417f82ec045]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0974bb4a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527063, 'tstamp': 527063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229713, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0974bb4a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527065, 'tstamp': 527065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229713, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.104 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0974bb4a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.105 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.106 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.107 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0974bb4a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.108 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.108 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0974bb4a-b0, col_values=(('external_ids', {'iface-id': 'c7609c2f-a62c-4f5c-ae1f-79d31dc0530f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:56 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:56.108 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.270 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766776.2702363, f8bba2b8-618b-4c55-93d9-4de905bc3554 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.271 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] VM Started (Lifecycle Event)#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.294 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.298 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766776.2719753, f8bba2b8-618b-4c55-93d9-4de905bc3554 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.298 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.319 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.322 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.342 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.890 183134 DEBUG nova.compute.manager [req-584256a4-f11e-47d2-b29e-4906aba8d238 req-a5aa1225-4650-4a8a-82b7-946f91a688fe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.891 183134 DEBUG oslo_concurrency.lockutils [req-584256a4-f11e-47d2-b29e-4906aba8d238 req-a5aa1225-4650-4a8a-82b7-946f91a688fe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.891 183134 DEBUG oslo_concurrency.lockutils [req-584256a4-f11e-47d2-b29e-4906aba8d238 req-a5aa1225-4650-4a8a-82b7-946f91a688fe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.891 183134 DEBUG oslo_concurrency.lockutils [req-584256a4-f11e-47d2-b29e-4906aba8d238 req-a5aa1225-4650-4a8a-82b7-946f91a688fe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.891 183134 DEBUG nova.compute.manager [req-584256a4-f11e-47d2-b29e-4906aba8d238 req-a5aa1225-4650-4a8a-82b7-946f91a688fe dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Processing event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.891 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.895 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766776.8957577, f8bba2b8-618b-4c55-93d9-4de905bc3554 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.896 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.897 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.900 183134 INFO nova.virt.libvirt.driver [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Instance spawned successfully.#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.901 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.918 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.923 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.928 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.928 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.929 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.929 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.929 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.930 183134 DEBUG nova.virt.libvirt.driver [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.958 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.994 183134 INFO nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Took 10.48 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:52:56 np0005601977 nova_compute[183130]: 2026-01-30 09:52:56.994 183134 DEBUG nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:52:57 np0005601977 nova_compute[183130]: 2026-01-30 09:52:57.092 183134 INFO nova.compute.manager [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Took 10.95 seconds to build instance.#033[00m
Jan 30 04:52:57 np0005601977 nova_compute[183130]: 2026-01-30 09:52:57.112 183134 DEBUG oslo_concurrency.lockutils [None req-183a933d-6e70-4855-9763-f9c7cbd8e616 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:57.408 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:57.408 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:52:57.409 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:58 np0005601977 nova_compute[183130]: 2026-01-30 09:52:58.034 183134 DEBUG nova.network.neutron [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updated VIF entry in instance network info cache for port 5db637a6-7a24-49dc-a9c6-7a733c38f45a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:52:58 np0005601977 nova_compute[183130]: 2026-01-30 09:52:58.035 183134 DEBUG nova.network.neutron [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updating instance_info_cache with network_info: [{"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:52:58 np0005601977 nova_compute[183130]: 2026-01-30 09:52:58.050 183134 DEBUG oslo_concurrency.lockutils [req-bdb87fe4-98fc-469b-b39b-d22f725bab33 req-4023dee3-adae-44bb-8216-545ab9067624 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:52:58 np0005601977 nova_compute[183130]: 2026-01-30 09:52:58.415 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:52:58 np0005601977 podman[229721]: 2026-01-30 09:52:58.873132904 +0000 UTC m=+0.085574411 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:52:58 np0005601977 podman[229722]: 2026-01-30 09:52:58.873029321 +0000 UTC m=+0.085951521 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:52:59 np0005601977 nova_compute[183130]: 2026-01-30 09:52:59.006 183134 DEBUG nova.compute.manager [req-39640b6b-3202-4eda-879e-69875f2b36cb req-32e60899-b3c0-4245-a9b3-4cdd7fcac362 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:52:59 np0005601977 nova_compute[183130]: 2026-01-30 09:52:59.006 183134 DEBUG oslo_concurrency.lockutils [req-39640b6b-3202-4eda-879e-69875f2b36cb req-32e60899-b3c0-4245-a9b3-4cdd7fcac362 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:59 np0005601977 nova_compute[183130]: 2026-01-30 09:52:59.006 183134 DEBUG oslo_concurrency.lockutils [req-39640b6b-3202-4eda-879e-69875f2b36cb req-32e60899-b3c0-4245-a9b3-4cdd7fcac362 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:59 np0005601977 nova_compute[183130]: 2026-01-30 09:52:59.006 183134 DEBUG oslo_concurrency.lockutils [req-39640b6b-3202-4eda-879e-69875f2b36cb req-32e60899-b3c0-4245-a9b3-4cdd7fcac362 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:59 np0005601977 nova_compute[183130]: 2026-01-30 09:52:59.007 183134 DEBUG nova.compute.manager [req-39640b6b-3202-4eda-879e-69875f2b36cb req-32e60899-b3c0-4245-a9b3-4cdd7fcac362 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] No waiting events found dispatching network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:52:59 np0005601977 nova_compute[183130]: 2026-01-30 09:52:59.007 183134 WARNING nova.compute.manager [req-39640b6b-3202-4eda-879e-69875f2b36cb req-32e60899-b3c0-4245-a9b3-4cdd7fcac362 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received unexpected event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a for instance with vm_state active and task_state None.#033[00m
Jan 30 04:53:00 np0005601977 nova_compute[183130]: 2026-01-30 09:53:00.072 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:02 np0005601977 nova_compute[183130]: 2026-01-30 09:53:02.197 183134 DEBUG nova.compute.manager [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-changed-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:02 np0005601977 nova_compute[183130]: 2026-01-30 09:53:02.198 183134 DEBUG nova.compute.manager [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Refreshing instance network info cache due to event network-changed-5db637a6-7a24-49dc-a9c6-7a733c38f45a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:53:02 np0005601977 nova_compute[183130]: 2026-01-30 09:53:02.198 183134 DEBUG oslo_concurrency.lockutils [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:53:02 np0005601977 nova_compute[183130]: 2026-01-30 09:53:02.199 183134 DEBUG oslo_concurrency.lockutils [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:53:02 np0005601977 nova_compute[183130]: 2026-01-30 09:53:02.199 183134 DEBUG nova.network.neutron [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Refreshing network info cache for port 5db637a6-7a24-49dc-a9c6-7a733c38f45a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:53:03 np0005601977 nova_compute[183130]: 2026-01-30 09:53:03.416 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:03 np0005601977 nova_compute[183130]: 2026-01-30 09:53:03.517 183134 DEBUG nova.network.neutron [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updated VIF entry in instance network info cache for port 5db637a6-7a24-49dc-a9c6-7a733c38f45a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:53:03 np0005601977 nova_compute[183130]: 2026-01-30 09:53:03.518 183134 DEBUG nova.network.neutron [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updating instance_info_cache with network_info: [{"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:53:03 np0005601977 nova_compute[183130]: 2026-01-30 09:53:03.539 183134 DEBUG oslo_concurrency.lockutils [req-687724ee-a8c9-416c-a56c-d28a630065b0 req-dfbb71fa-93fd-490b-bcc8-891136f5f75f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:53:05 np0005601977 nova_compute[183130]: 2026-01-30 09:53:05.074 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:05 np0005601977 podman[229764]: 2026-01-30 09:53:05.874232364 +0000 UTC m=+0.080657816 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:53:08 np0005601977 nova_compute[183130]: 2026-01-30 09:53:08.417 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:09Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b6:32:81 10.100.0.10
Jan 30 04:53:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:09Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:32:81 10.100.0.10
Jan 30 04:53:10 np0005601977 nova_compute[183130]: 2026-01-30 09:53:10.077 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:13 np0005601977 nova_compute[183130]: 2026-01-30 09:53:13.462 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:14 np0005601977 podman[229803]: 2026-01-30 09:53:14.850025781 +0000 UTC m=+0.061716875 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:53:15 np0005601977 nova_compute[183130]: 2026-01-30 09:53:15.079 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:18 np0005601977 nova_compute[183130]: 2026-01-30 09:53:18.464 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.359 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.430 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.431 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.431 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.431 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.519 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.598 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.599 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.643 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.650 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.694 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.695 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.769 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.905 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.907 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5373MB free_disk=73.18389129638672GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.907 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:19 np0005601977 nova_compute[183130]: 2026-01-30 09:53:19.907 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.054 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.055 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance f8bba2b8-618b-4c55-93d9-4de905bc3554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.055 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.056 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.067 183134 DEBUG nova.compute.manager [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-changed-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.068 183134 DEBUG nova.compute.manager [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Refreshing instance network info cache due to event network-changed-5db637a6-7a24-49dc-a9c6-7a733c38f45a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.068 183134 DEBUG oslo_concurrency.lockutils [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.069 183134 DEBUG oslo_concurrency.lockutils [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.069 183134 DEBUG nova.network.neutron [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Refreshing network info cache for port 5db637a6-7a24-49dc-a9c6-7a733c38f45a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.081 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.135 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing inventories for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.148 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.148 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.149 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.149 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.149 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.150 183134 INFO nova.compute.manager [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Terminating instance#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.151 183134 DEBUG nova.compute.manager [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:53:20 np0005601977 kernel: tap5db637a6-7a (unregistering): left promiscuous mode
Jan 30 04:53:20 np0005601977 NetworkManager[55565]: <info>  [1769766800.1893] device (tap5db637a6-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.229 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating ProviderTree inventory for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.230 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Updating inventory in ProviderTree for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:53:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:20Z|00525|binding|INFO|Releasing lport 5db637a6-7a24-49dc-a9c6-7a733c38f45a from this chassis (sb_readonly=0)
Jan 30 04:53:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:20Z|00526|binding|INFO|Setting lport 5db637a6-7a24-49dc-a9c6-7a733c38f45a down in Southbound
Jan 30 04:53:20 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:20Z|00527|binding|INFO|Removing iface tap5db637a6-7a ovn-installed in OVS
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.233 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.238 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.239 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:32:81 10.100.0.10 2001:db8:0:1:f816:3eff:feb6:3281 2001:db8::f816:3eff:feb6:3281'], port_security=['fa:16:3e:b6:32:81 10.100.0.10 2001:db8:0:1:f816:3eff:feb6:3281 2001:db8::f816:3eff:feb6:3281'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8:0:1:f816:3eff:feb6:3281/64 2001:db8::f816:3eff:feb6:3281/64', 'neutron:device_id': 'f8bba2b8-618b-4c55-93d9-4de905bc3554', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '720a7e6b-119b-41e0-ac76-ea253cb891fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ed9f11f-a0b3-4864-9831-309b1f69376a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=5db637a6-7a24-49dc-a9c6-7a733c38f45a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.242 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 5db637a6-7a24-49dc-a9c6-7a733c38f45a in datapath 0974bb4a-b27d-43c9-b594-a23be3309557 unbound from our chassis#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.244 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0974bb4a-b27d-43c9-b594-a23be3309557#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.257 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing aggregate associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.261 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[95c8cc13-fb89-4e1e-9202-d32a4dd1ecfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:20 np0005601977 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000035.scope: Deactivated successfully.
Jan 30 04:53:20 np0005601977 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000035.scope: Consumed 12.106s CPU time.
Jan 30 04:53:20 np0005601977 systemd-machined[154431]: Machine qemu-42-instance-00000035 terminated.
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.283 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Refreshing trait associations for resource provider eb11f67d-14b4-46ee-89fd-92936c45ed58, traits: HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.289 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ede4fb15-04ec-4038-8740-feb61cf01f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.292 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a80f8d-ad6e-4eab-92c5-bbb5f2f2d92b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.311 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c07610-4554-4b28-804a-1be5fbf5b9ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.325 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb0b431-0d52-483a-9ef2-82d956842761]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0974bb4a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:cf:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 42, 'tx_packets': 7, 'rx_bytes': 3468, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 42, 'tx_packets': 7, 'rx_bytes': 3468, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527053, 'reachable_time': 22154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 38, 'inoctets': 2768, 'indelivers': 13, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 38, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2768, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 38, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 13, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229856, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.338 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c1428e50-9a7e-410e-9a1a-967f619806f6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0974bb4a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527063, 'tstamp': 527063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229857, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0974bb4a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527065, 'tstamp': 527065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229857, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.339 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0974bb4a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.341 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.345 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.345 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0974bb4a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.345 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.346 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0974bb4a-b0, col_values=(('external_ids', {'iface-id': 'c7609c2f-a62c-4f5c-ae1f-79d31dc0530f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:20 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:20.346 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.352 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.370 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.413 183134 INFO nova.virt.libvirt.driver [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Instance destroyed successfully.#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.414 183134 DEBUG nova.objects.instance [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid f8bba2b8-618b-4c55-93d9-4de905bc3554 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.416 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.416 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.444 183134 DEBUG nova.virt.libvirt.vif [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:52:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-33413424',display_name='tempest-TestGettingAddress-server-33413424',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-33413424',id=53,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK00D8vecAOmC+75D7EixtAKdLQu9WukSVS+bFDzxBD4lZ5qARxqvenS++neSnkrPQmUB2Bbt5//1XqfaHhAnE3u5mJ7J9hfok68eZ6IhkUY/HmJr7e+w7zzvloOYyUpWg==',key_name='tempest-TestGettingAddress-283787783',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:52:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gupr8re3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:52:57Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=f8bba2b8-618b-4c55-93d9-4de905bc3554,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.444 183134 DEBUG nova.network.os_vif_util [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.445 183134 DEBUG nova.network.os_vif_util [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.446 183134 DEBUG os_vif [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.447 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.447 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5db637a6-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.449 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.451 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.454 183134 INFO os_vif [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:32:81,bridge_name='br-int',has_traffic_filtering=True,id=5db637a6-7a24-49dc-a9c6-7a733c38f45a,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db637a6-7a')#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.454 183134 INFO nova.virt.libvirt.driver [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Deleting instance files /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554_del#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.454 183134 INFO nova.virt.libvirt.driver [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Deletion of /var/lib/nova/instances/f8bba2b8-618b-4c55-93d9-4de905bc3554_del complete#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.517 183134 INFO nova.compute.manager [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.518 183134 DEBUG oslo.service.loopingcall [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.518 183134 DEBUG nova.compute.manager [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.519 183134 DEBUG nova.network.neutron [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.527 183134 DEBUG nova.compute.manager [req-b922848a-3f8f-4c5f-bb48-4d987798f6a7 req-8283e701-0578-4b33-90d1-5cd92568fec9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-vif-unplugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.528 183134 DEBUG oslo_concurrency.lockutils [req-b922848a-3f8f-4c5f-bb48-4d987798f6a7 req-8283e701-0578-4b33-90d1-5cd92568fec9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.528 183134 DEBUG oslo_concurrency.lockutils [req-b922848a-3f8f-4c5f-bb48-4d987798f6a7 req-8283e701-0578-4b33-90d1-5cd92568fec9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.529 183134 DEBUG oslo_concurrency.lockutils [req-b922848a-3f8f-4c5f-bb48-4d987798f6a7 req-8283e701-0578-4b33-90d1-5cd92568fec9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.529 183134 DEBUG nova.compute.manager [req-b922848a-3f8f-4c5f-bb48-4d987798f6a7 req-8283e701-0578-4b33-90d1-5cd92568fec9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] No waiting events found dispatching network-vif-unplugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:53:20 np0005601977 nova_compute[183130]: 2026-01-30 09:53:20.530 183134 DEBUG nova.compute.manager [req-b922848a-3f8f-4c5f-bb48-4d987798f6a7 req-8283e701-0578-4b33-90d1-5cd92568fec9 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-vif-unplugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.373 183134 DEBUG nova.network.neutron [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.444 183134 INFO nova.compute.manager [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Took 0.93 seconds to deallocate network for instance.#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.510 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.514 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.618 183134 DEBUG nova.compute.provider_tree [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.634 183134 DEBUG nova.scheduler.client.report [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.656 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.687 183134 INFO nova.scheduler.client.report [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance f8bba2b8-618b-4c55-93d9-4de905bc3554#033[00m
Jan 30 04:53:21 np0005601977 nova_compute[183130]: 2026-01-30 09:53:21.742 183134 DEBUG oslo_concurrency.lockutils [None req-68715be2-063d-4b64-b58c-b4bc750559c4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.188 183134 DEBUG nova.compute.manager [req-d83d1c26-e411-4fd6-b489-8baa962b132d req-3f8f5eca-8273-4f84-96d4-588684f3730a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-vif-deleted-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.372 183134 DEBUG nova.network.neutron [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updated VIF entry in instance network info cache for port 5db637a6-7a24-49dc-a9c6-7a733c38f45a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.373 183134 DEBUG nova.network.neutron [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Updating instance_info_cache with network_info: [{"id": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "address": "fa:16:3e:b6:32:81", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb6:3281", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db637a6-7a", "ovs_interfaceid": "5db637a6-7a24-49dc-a9c6-7a733c38f45a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.395 183134 DEBUG oslo_concurrency.lockutils [req-ac873bd7-cc1f-46ee-9244-c2d8e3ac801f req-78d9cdb6-56e1-47f0-85f3-953877b7f0ce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-f8bba2b8-618b-4c55-93d9-4de905bc3554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.503 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.504 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.504 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.505 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.505 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.506 183134 INFO nova.compute.manager [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Terminating instance#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.507 183134 DEBUG nova.compute.manager [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:53:22 np0005601977 kernel: tap7c0dcbc0-e8 (unregistering): left promiscuous mode
Jan 30 04:53:22 np0005601977 NetworkManager[55565]: <info>  [1769766802.5273] device (tap7c0dcbc0-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:53:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:22Z|00528|binding|INFO|Releasing lport 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 from this chassis (sb_readonly=0)
Jan 30 04:53:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:22Z|00529|binding|INFO|Setting lport 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 down in Southbound
Jan 30 04:53:22 np0005601977 ovn_controller[95460]: 2026-01-30T09:53:22Z|00530|binding|INFO|Removing iface tap7c0dcbc0-e8 ovn-installed in OVS
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.531 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.533 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.539 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:6a:47 10.100.0.9 2001:db8:0:1:f816:3eff:fe05:6a47 2001:db8::f816:3eff:fe05:6a47'], port_security=['fa:16:3e:05:6a:47 10.100.0.9 2001:db8:0:1:f816:3eff:fe05:6a47 2001:db8::f816:3eff:fe05:6a47'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe05:6a47/64 2001:db8::f816:3eff:fe05:6a47/64', 'neutron:device_id': 'b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0974bb4a-b27d-43c9-b594-a23be3309557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '720a7e6b-119b-41e0-ac76-ea253cb891fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ed9f11f-a0b3-4864-9831-309b1f69376a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=7c0dcbc0-e8be-4d24-9ace-95ca83739c94) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.540 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 in datapath 0974bb4a-b27d-43c9-b594-a23be3309557 unbound from our chassis#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.541 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0974bb4a-b27d-43c9-b594-a23be3309557, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.542 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.543 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[353e8cdf-00f9-4f8d-a9d4-ad804fdd8d0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.544 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557 namespace which is not needed anymore#033[00m
Jan 30 04:53:22 np0005601977 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 30 04:53:22 np0005601977 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000034.scope: Consumed 15.252s CPU time.
Jan 30 04:53:22 np0005601977 systemd-machined[154431]: Machine qemu-41-instance-00000034 terminated.
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.612 183134 DEBUG nova.compute.manager [req-9446e80c-9850-45e9-abcd-bc1f3fe60f2b req-6711d4c3-180a-47b6-9a1a-87cc168fe4a2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.613 183134 DEBUG oslo_concurrency.lockutils [req-9446e80c-9850-45e9-abcd-bc1f3fe60f2b req-6711d4c3-180a-47b6-9a1a-87cc168fe4a2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.613 183134 DEBUG oslo_concurrency.lockutils [req-9446e80c-9850-45e9-abcd-bc1f3fe60f2b req-6711d4c3-180a-47b6-9a1a-87cc168fe4a2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.614 183134 DEBUG oslo_concurrency.lockutils [req-9446e80c-9850-45e9-abcd-bc1f3fe60f2b req-6711d4c3-180a-47b6-9a1a-87cc168fe4a2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "f8bba2b8-618b-4c55-93d9-4de905bc3554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.614 183134 DEBUG nova.compute.manager [req-9446e80c-9850-45e9-abcd-bc1f3fe60f2b req-6711d4c3-180a-47b6-9a1a-87cc168fe4a2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] No waiting events found dispatching network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.614 183134 WARNING nova.compute.manager [req-9446e80c-9850-45e9-abcd-bc1f3fe60f2b req-6711d4c3-180a-47b6-9a1a-87cc168fe4a2 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Received unexpected event network-vif-plugged-5db637a6-7a24-49dc-a9c6-7a733c38f45a for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:53:22 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [NOTICE]   (229447) : haproxy version is 2.8.14-c23fe91
Jan 30 04:53:22 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [NOTICE]   (229447) : path to executable is /usr/sbin/haproxy
Jan 30 04:53:22 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [WARNING]  (229447) : Exiting Master process...
Jan 30 04:53:22 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [WARNING]  (229447) : Exiting Master process...
Jan 30 04:53:22 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [ALERT]    (229447) : Current worker (229449) exited with code 143 (Terminated)
Jan 30 04:53:22 np0005601977 neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557[229443]: [WARNING]  (229447) : All workers exited. Exiting... (0)
Jan 30 04:53:22 np0005601977 systemd[1]: libpod-1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7.scope: Deactivated successfully.
Jan 30 04:53:22 np0005601977 podman[229899]: 2026-01-30 09:53:22.682465335 +0000 UTC m=+0.054351815 container died 1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:53:22 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7-userdata-shm.mount: Deactivated successfully.
Jan 30 04:53:22 np0005601977 systemd[1]: var-lib-containers-storage-overlay-2339da5761650c8d6a23365ccbc290a72cd73a127597099081e0b65a2753daa4-merged.mount: Deactivated successfully.
Jan 30 04:53:22 np0005601977 podman[229899]: 2026-01-30 09:53:22.719779011 +0000 UTC m=+0.091665471 container cleanup 1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:53:22 np0005601977 NetworkManager[55565]: <info>  [1769766802.7228] manager: (tap7c0dcbc0-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Jan 30 04:53:22 np0005601977 systemd[1]: libpod-conmon-1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7.scope: Deactivated successfully.
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.761 183134 INFO nova.virt.libvirt.driver [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance destroyed successfully.#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.763 183134 DEBUG nova.objects.instance [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.779 183134 DEBUG nova.virt.libvirt.vif [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:52:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1649993017',display_name='tempest-TestGettingAddress-server-1649993017',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1649993017',id=52,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK00D8vecAOmC+75D7EixtAKdLQu9WukSVS+bFDzxBD4lZ5qARxqvenS++neSnkrPQmUB2Bbt5//1XqfaHhAnE3u5mJ7J9hfok68eZ6IhkUY/HmJr7e+w7zzvloOYyUpWg==',key_name='tempest-TestGettingAddress-283787783',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:52:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gpyx1avo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:52:20Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.780 183134 DEBUG nova.network.os_vif_util [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "address": "fa:16:3e:05:6a:47", "network": {"id": "0974bb4a-b27d-43c9-b594-a23be3309557", "bridge": "br-int", "label": "tempest-network-smoke--86792552", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe05:6a47", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c0dcbc0-e8", "ovs_interfaceid": "7c0dcbc0-e8be-4d24-9ace-95ca83739c94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.781 183134 DEBUG nova.network.os_vif_util [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.782 183134 DEBUG os_vif [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:53:22 np0005601977 podman[229933]: 2026-01-30 09:53:22.782921876 +0000 UTC m=+0.042346911 container remove 1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.784 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.785 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c0dcbc0-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.787 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7b8cc5-69ae-4b0f-b9d0-cc89e0ea5d04]: (4, ('Fri Jan 30 09:53:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557 (1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7)\n1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7\nFri Jan 30 09:53:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557 (1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7)\n1b727dd8e8f1efb83efbb0e01294aad9ef4395236c15fbf06e28b6e3db97fce7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.787 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.789 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[56bd0ad8-567b-4477-af27-ff8c6a370e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.790 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0974bb4a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.790 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.791 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 kernel: tap0974bb4a-b0: left promiscuous mode
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.793 183134 INFO os_vif [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:6a:47,bridge_name='br-int',has_traffic_filtering=True,id=7c0dcbc0-e8be-4d24-9ace-95ca83739c94,network=Network(0974bb4a-b27d-43c9-b594-a23be3309557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c0dcbc0-e8')#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.794 183134 INFO nova.virt.libvirt.driver [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Deleting instance files /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4_del#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.794 183134 INFO nova.virt.libvirt.driver [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Deletion of /var/lib/nova/instances/b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4_del complete#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.798 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.799 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[952a48ce-7927-4333-b566-fb7b9b5bfe68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.814 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a8335ca4-c5b0-4041-a338-064d699eda83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.815 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed103cc-1b18-4975-bfdb-750d00e782ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.824 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4d351fc2-fc5b-4cf3-9a6e-29333bd82d43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527047, 'reachable_time': 36842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229958, 'error': None, 'target': 'ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 systemd[1]: run-netns-ovnmeta\x2d0974bb4a\x2db27d\x2d43c9\x2db594\x2da23be3309557.mount: Deactivated successfully.
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.829 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0974bb4a-b27d-43c9-b594-a23be3309557 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:53:22 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:22.829 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[47068b58-c707-48d6-a349-33399c88b67d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.849 183134 INFO nova.compute.manager [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.850 183134 DEBUG oslo.service.loopingcall [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.851 183134 DEBUG nova.compute.manager [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:53:22 np0005601977 nova_compute[183130]: 2026-01-30 09:53:22.851 183134 DEBUG nova.network.neutron [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.330 183134 DEBUG nova.network.neutron [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.348 183134 INFO nova.compute.manager [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Took 0.50 seconds to deallocate network for instance.#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.390 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.391 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.439 183134 DEBUG nova.compute.provider_tree [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.459 183134 DEBUG nova.scheduler.client.report [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.466 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.486 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.518 183134 INFO nova.scheduler.client.report [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4#033[00m
Jan 30 04:53:23 np0005601977 nova_compute[183130]: 2026-01-30 09:53:23.587 183134 DEBUG oslo_concurrency.lockutils [None req-fc68e082-7851-42e5-b342-48b3dc983f1e 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.268 183134 DEBUG nova.compute.manager [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-changed-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.269 183134 DEBUG nova.compute.manager [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Refreshing instance network info cache due to event network-changed-7c0dcbc0-e8be-4d24-9ace-95ca83739c94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.269 183134 DEBUG oslo_concurrency.lockutils [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.270 183134 DEBUG oslo_concurrency.lockutils [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.270 183134 DEBUG nova.network.neutron [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Refreshing network info cache for port 7c0dcbc0-e8be-4d24-9ace-95ca83739c94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.682 183134 DEBUG nova.network.neutron [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.724 183134 DEBUG nova.compute.manager [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-vif-unplugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.724 183134 DEBUG oslo_concurrency.lockutils [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.724 183134 DEBUG oslo_concurrency.lockutils [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.725 183134 DEBUG oslo_concurrency.lockutils [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.725 183134 DEBUG nova.compute.manager [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] No waiting events found dispatching network-vif-unplugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.725 183134 WARNING nova.compute.manager [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received unexpected event network-vif-unplugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.726 183134 DEBUG nova.compute.manager [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.726 183134 DEBUG oslo_concurrency.lockutils [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.726 183134 DEBUG oslo_concurrency.lockutils [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.727 183134 DEBUG oslo_concurrency.lockutils [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.727 183134 DEBUG nova.compute.manager [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] No waiting events found dispatching network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:53:24 np0005601977 nova_compute[183130]: 2026-01-30 09:53:24.727 183134 WARNING nova.compute.manager [req-61aa4eb5-6213-45d1-8ebd-b2494a87dd93 req-e0b87198-552e-4c85-a8de-b9fa502bb9ca dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received unexpected event network-vif-plugged-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:53:24 np0005601977 podman[229960]: 2026-01-30 09:53:24.842966011 +0000 UTC m=+0.052363448 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:53:24 np0005601977 podman[229959]: 2026-01-30 09:53:24.852127743 +0000 UTC m=+0.063664831 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Jan 30 04:53:25 np0005601977 nova_compute[183130]: 2026-01-30 09:53:25.049 183134 DEBUG nova.network.neutron [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 30 04:53:25 np0005601977 nova_compute[183130]: 2026-01-30 09:53:25.050 183134 DEBUG oslo_concurrency.lockutils [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:53:25 np0005601977 nova_compute[183130]: 2026-01-30 09:53:25.051 183134 DEBUG nova.compute.manager [req-01363098-180c-43a4-a88f-4af312d35589 req-2e933df6-f49c-4e04-bd11-da62f325c7b8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Received event network-vif-deleted-7c0dcbc0-e8be-4d24-9ace-95ca83739c94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:25 np0005601977 nova_compute[183130]: 2026-01-30 09:53:25.399 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:25 np0005601977 nova_compute[183130]: 2026-01-30 09:53:25.400 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:26 np0005601977 nova_compute[183130]: 2026-01-30 09:53:26.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:26 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:26.977 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:53:26 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:26.978 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:53:26 np0005601977 nova_compute[183130]: 2026-01-30 09:53:26.977 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:27 np0005601977 nova_compute[183130]: 2026-01-30 09:53:27.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:27 np0005601977 nova_compute[183130]: 2026-01-30 09:53:27.788 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:28 np0005601977 nova_compute[183130]: 2026-01-30 09:53:28.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:28 np0005601977 nova_compute[183130]: 2026-01-30 09:53:28.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:28 np0005601977 nova_compute[183130]: 2026-01-30 09:53:28.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:53:28 np0005601977 nova_compute[183130]: 2026-01-30 09:53:28.469 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:29 np0005601977 podman[230002]: 2026-01-30 09:53:29.84431064 +0000 UTC m=+0.055444877 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:53:29 np0005601977 podman[230003]: 2026-01-30 09:53:29.844522086 +0000 UTC m=+0.056786355 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:53:31 np0005601977 nova_compute[183130]: 2026-01-30 09:53:31.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:31 np0005601977 nova_compute[183130]: 2026-01-30 09:53:31.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:53:31 np0005601977 nova_compute[183130]: 2026-01-30 09:53:31.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:53:31 np0005601977 nova_compute[183130]: 2026-01-30 09:53:31.365 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:53:32 np0005601977 nova_compute[183130]: 2026-01-30 09:53:32.790 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:32.979 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:53:33 np0005601977 nova_compute[183130]: 2026-01-30 09:53:33.470 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:33 np0005601977 nova_compute[183130]: 2026-01-30 09:53:33.755 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:34 np0005601977 nova_compute[183130]: 2026-01-30 09:53:34.362 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:35 np0005601977 nova_compute[183130]: 2026-01-30 09:53:35.338 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:53:35 np0005601977 nova_compute[183130]: 2026-01-30 09:53:35.411 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766800.4098973, f8bba2b8-618b-4c55-93d9-4de905bc3554 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:53:35 np0005601977 nova_compute[183130]: 2026-01-30 09:53:35.412 183134 INFO nova.compute.manager [-] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:53:35 np0005601977 nova_compute[183130]: 2026-01-30 09:53:35.448 183134 DEBUG nova.compute.manager [None req-fe04cc1f-99ba-4153-89c5-e04325800e06 - - - - - -] [instance: f8bba2b8-618b-4c55-93d9-4de905bc3554] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:53:36 np0005601977 podman[230045]: 2026-01-30 09:53:36.85442946 +0000 UTC m=+0.076628612 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:53:37 np0005601977 nova_compute[183130]: 2026-01-30 09:53:37.761 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766802.7598066, b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:53:37 np0005601977 nova_compute[183130]: 2026-01-30 09:53:37.761 183134 INFO nova.compute.manager [-] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:53:37 np0005601977 nova_compute[183130]: 2026-01-30 09:53:37.783 183134 DEBUG nova.compute.manager [None req-f10c22f2-28fa-4e3c-a465-6cd4f93aea0a - - - - - -] [instance: b0e8e4c6-021c-4d42-9f3d-7d0cb830cbf4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:53:37 np0005601977 nova_compute[183130]: 2026-01-30 09:53:37.792 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:38 np0005601977 nova_compute[183130]: 2026-01-30 09:53:38.508 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:42 np0005601977 nova_compute[183130]: 2026-01-30 09:53:42.794 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:43 np0005601977 nova_compute[183130]: 2026-01-30 09:53:43.575 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:45 np0005601977 podman[230070]: 2026-01-30 09:53:45.838404189 +0000 UTC m=+0.053124069 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:53:47 np0005601977 nova_compute[183130]: 2026-01-30 09:53:47.795 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:47.901 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:4d:5a 10.100.0.2 2001:db8::f816:3eff:fe34:4d5a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe34:4d5a/64', 'neutron:device_id': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb) old=Port_Binding(mac=['fa:16:3e:34:4d:5a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:53:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:47.903 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba updated#033[00m
Jan 30 04:53:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:47.903 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e49a07cb-1da2-4f34-9999-d9ea635349ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:53:47 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:47.904 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7a4470e8-a158-41e9-a129-6bc1edb4d6b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:48 np0005601977 nova_compute[183130]: 2026-01-30 09:53:48.578 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:51.994 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:4d:5a 10.100.0.2 2001:db8:0:1:f816:3eff:fe34:4d5a 2001:db8::f816:3eff:fe34:4d5a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe34:4d5a/64 2001:db8::f816:3eff:fe34:4d5a/64', 'neutron:device_id': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb) old=Port_Binding(mac=['fa:16:3e:34:4d:5a 10.100.0.2 2001:db8::f816:3eff:fe34:4d5a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe34:4d5a/64', 'neutron:device_id': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:53:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:51.995 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba updated#033[00m
Jan 30 04:53:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:51.996 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e49a07cb-1da2-4f34-9999-d9ea635349ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:53:51 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:51.996 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fb84c383-a2be-41f9-9f4d-2b2d776a351a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:53:52 np0005601977 nova_compute[183130]: 2026-01-30 09:53:52.797 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:53 np0005601977 nova_compute[183130]: 2026-01-30 09:53:53.580 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.459 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.459 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.459 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.459 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.459 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.460 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:53:55.460 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601977 podman[230096]: 2026-01-30 09:53:55.823449926 +0000 UTC m=+0.044968716 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:53:55 np0005601977 podman[230095]: 2026-01-30 09:53:55.836282533 +0000 UTC m=+0.057696690 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.7, maintainer=Red Hat, Inc., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.873 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.874 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.891 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.965 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.966 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.972 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:53:55 np0005601977 nova_compute[183130]: 2026-01-30 09:53:55.973 183134 INFO nova.compute.claims [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.085 183134 DEBUG nova.compute.provider_tree [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.104 183134 DEBUG nova.scheduler.client.report [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.123 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.124 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.175 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.176 183134 DEBUG nova.network.neutron [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.194 183134 INFO nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.211 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.304 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.305 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.306 183134 INFO nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Creating image(s)#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.306 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.306 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.307 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.319 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.364 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.365 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.366 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.378 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.432 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.433 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.467 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.467 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.468 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.513 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.514 183134 DEBUG nova.virt.disk.api [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.515 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.555 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.556 183134 DEBUG nova.virt.disk.api [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.556 183134 DEBUG nova.objects.instance [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.571 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.571 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Ensure instance console log exists: /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.571 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.572 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.572 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:56 np0005601977 nova_compute[183130]: 2026-01-30 09:53:56.711 183134 DEBUG nova.policy [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:53:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:57.408 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:53:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:57.409 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:53:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:53:57.409 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:53:57 np0005601977 nova_compute[183130]: 2026-01-30 09:53:57.665 183134 DEBUG nova.network.neutron [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Successfully created port: 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:53:57 np0005601977 nova_compute[183130]: 2026-01-30 09:53:57.799 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:58 np0005601977 nova_compute[183130]: 2026-01-30 09:53:58.610 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:53:58 np0005601977 nova_compute[183130]: 2026-01-30 09:53:58.969 183134 DEBUG nova.network.neutron [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Successfully updated port: 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:53:58 np0005601977 nova_compute[183130]: 2026-01-30 09:53:58.986 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:53:58 np0005601977 nova_compute[183130]: 2026-01-30 09:53:58.986 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:53:58 np0005601977 nova_compute[183130]: 2026-01-30 09:53:58.987 183134 DEBUG nova.network.neutron [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:53:59 np0005601977 nova_compute[183130]: 2026-01-30 09:53:59.067 183134 DEBUG nova.compute.manager [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-changed-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:53:59 np0005601977 nova_compute[183130]: 2026-01-30 09:53:59.068 183134 DEBUG nova.compute.manager [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Refreshing instance network info cache due to event network-changed-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:53:59 np0005601977 nova_compute[183130]: 2026-01-30 09:53:59.068 183134 DEBUG oslo_concurrency.lockutils [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:53:59 np0005601977 nova_compute[183130]: 2026-01-30 09:53:59.131 183134 DEBUG nova.network.neutron [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.700 183134 DEBUG nova.network.neutron [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updating instance_info_cache with network_info: [{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.718 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.719 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Instance network_info: |[{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.720 183134 DEBUG oslo_concurrency.lockutils [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.720 183134 DEBUG nova.network.neutron [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Refreshing network info cache for port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.726 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Start _get_guest_xml network_info=[{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.734 183134 WARNING nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.743 183134 DEBUG nova.virt.libvirt.host [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.744 183134 DEBUG nova.virt.libvirt.host [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.748 183134 DEBUG nova.virt.libvirt.host [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.749 183134 DEBUG nova.virt.libvirt.host [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.751 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.751 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.752 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.752 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.753 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.753 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.753 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.754 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.754 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.754 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.755 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.755 183134 DEBUG nova.virt.hardware [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.761 183134 DEBUG nova.virt.libvirt.vif [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:53:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1787753693',display_name='tempest-TestGettingAddress-server-1787753693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1787753693',id=54,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOnyiM4pBndaC9BevGYouaGGhacyKKVbrIisds2Goy+xQN3CKOZrIF+3ZMgXRir1UCFbJm87LU8JWc7YwuMxfpvTYMqstqZIyw52xOelCTwdMMXaw1ygIkhytJtbuNJeQQ==',key_name='tempest-TestGettingAddress-2057519988',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gijkf33n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:53:56Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=93393ec0-3300-4ba6-a539-ffaaa32ffdc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.762 183134 DEBUG nova.network.os_vif_util [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.763 183134 DEBUG nova.network.os_vif_util [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.765 183134 DEBUG nova.objects.instance [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.778 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <uuid>93393ec0-3300-4ba6-a539-ffaaa32ffdc2</uuid>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <name>instance-00000036</name>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-1787753693</nova:name>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:54:00</nova:creationTime>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        <nova:port uuid="12c93e5f-3f77-4417-a6c0-a9ddd740d0f9">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fece:d35f" ipVersion="6"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fece:d35f" ipVersion="6"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <entry name="serial">93393ec0-3300-4ba6-a539-ffaaa32ffdc2</entry>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <entry name="uuid">93393ec0-3300-4ba6-a539-ffaaa32ffdc2</entry>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.config"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:ce:d3:5f"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <target dev="tap12c93e5f-3f"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/console.log" append="off"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:54:00 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:54:00 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:54:00 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:54:00 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.779 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Preparing to wait for external event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.779 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.779 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.779 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.780 183134 DEBUG nova.virt.libvirt.vif [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:53:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1787753693',display_name='tempest-TestGettingAddress-server-1787753693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1787753693',id=54,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOnyiM4pBndaC9BevGYouaGGhacyKKVbrIisds2Goy+xQN3CKOZrIF+3ZMgXRir1UCFbJm87LU8JWc7YwuMxfpvTYMqstqZIyw52xOelCTwdMMXaw1ygIkhytJtbuNJeQQ==',key_name='tempest-TestGettingAddress-2057519988',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gijkf33n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:53:56Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=93393ec0-3300-4ba6-a539-ffaaa32ffdc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.780 183134 DEBUG nova.network.os_vif_util [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.781 183134 DEBUG nova.network.os_vif_util [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.781 183134 DEBUG os_vif [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.781 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.782 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.782 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.785 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.785 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12c93e5f-3f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.786 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12c93e5f-3f, col_values=(('external_ids', {'iface-id': '12c93e5f-3f77-4417-a6c0-a9ddd740d0f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:d3:5f', 'vm-uuid': '93393ec0-3300-4ba6-a539-ffaaa32ffdc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.787 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:00 np0005601977 NetworkManager[55565]: <info>  [1769766840.7895] manager: (tap12c93e5f-3f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.790 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.795 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.796 183134 INFO os_vif [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f')#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.847 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:54:00 np0005601977 podman[230151]: 2026-01-30 09:54:00.847335081 +0000 UTC m=+0.055312372 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.847 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.847 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:ce:d3:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:54:00 np0005601977 nova_compute[183130]: 2026-01-30 09:54:00.847 183134 INFO nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Using config drive#033[00m
Jan 30 04:54:00 np0005601977 podman[230152]: 2026-01-30 09:54:00.85604894 +0000 UTC m=+0.064928466 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.112 183134 INFO nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Creating config drive at /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.config#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.117 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph5nw0l3k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.238 183134 DEBUG oslo_concurrency.processutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph5nw0l3k" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:01 np0005601977 NetworkManager[55565]: <info>  [1769766841.3032] manager: (tap12c93e5f-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Jan 30 04:54:01 np0005601977 kernel: tap12c93e5f-3f: entered promiscuous mode
Jan 30 04:54:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:01Z|00531|binding|INFO|Claiming lport 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 for this chassis.
Jan 30 04:54:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:01Z|00532|binding|INFO|12c93e5f-3f77-4417-a6c0-a9ddd740d0f9: Claiming fa:16:3e:ce:d3:5f 10.100.0.11 2001:db8:0:1:f816:3eff:fece:d35f 2001:db8::f816:3eff:fece:d35f
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.306 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.308 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.324 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:d3:5f 10.100.0.11 2001:db8:0:1:f816:3eff:fece:d35f 2001:db8::f816:3eff:fece:d35f'], port_security=['fa:16:3e:ce:d3:5f 10.100.0.11 2001:db8:0:1:f816:3eff:fece:d35f 2001:db8::f816:3eff:fece:d35f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fece:d35f/64 2001:db8::f816:3eff:fece:d35f/64', 'neutron:device_id': '93393ec0-3300-4ba6-a539-ffaaa32ffdc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '748616c7-5270-4b3a-b8ec-d02da066836a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.325 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba bound to our chassis#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.326 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e49a07cb-1da2-4f34-9999-d9ea635349ba#033[00m
Jan 30 04:54:01 np0005601977 systemd-udevd[230213]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.333 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.335 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5a828b-7e49-4616-9d40-8c051a9c5c23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.336 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape49a07cb-11 in ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:54:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:01Z|00533|binding|INFO|Setting lport 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 ovn-installed in OVS
Jan 30 04:54:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:01Z|00534|binding|INFO|Setting lport 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 up in Southbound
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.339 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.339 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape49a07cb-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.339 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[87c68ee1-b2fb-4a84-9014-ebe6c2e61604]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 systemd-machined[154431]: New machine qemu-43-instance-00000036.
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.341 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[91e1eb83-1c3a-4deb-8f0c-902380fe28f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 NetworkManager[55565]: <info>  [1769766841.3462] device (tap12c93e5f-3f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:54:01 np0005601977 NetworkManager[55565]: <info>  [1769766841.3469] device (tap12c93e5f-3f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.350 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[98e62085-2c2c-4274-94b0-a1ccee0f773d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 systemd[1]: Started Virtual Machine qemu-43-instance-00000036.
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.371 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a73c6a22-967c-47dd-9c9b-b220201c0bf8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.391 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb15c0c-996d-400e-a74e-8b0efb58cd03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 NetworkManager[55565]: <info>  [1769766841.3982] manager: (tape49a07cb-10): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.397 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[92675e29-3df1-4114-9a9d-5e951a72c3e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.416 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb39f2b-ec62-4231-8795-d10046f367c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.418 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ce80b076-53ed-493d-ad8d-fbec770ce646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 NetworkManager[55565]: <info>  [1769766841.4322] device (tape49a07cb-10): carrier: link connected
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.436 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0526f5-30a8-4fc2-94bd-6da1ce99790f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.446 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4f7ae3-ac8a-48ae-9878-c9bf086d33ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape49a07cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:4d:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537277, 'reachable_time': 26772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230248, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.456 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[10d38873-fd74-4be2-a11d-82be39c069eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:4d5a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537277, 'tstamp': 537277}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230249, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.466 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[e8851e91-4578-4eb7-91f3-3c463f9cccc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape49a07cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:4d:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537277, 'reachable_time': 26772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230250, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.488 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbd28d8-5f97-49e8-b079-d1bfc8f9ae0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.531 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[d32920aa-21ff-4ad9-b69e-55f65a260c6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.532 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape49a07cb-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.533 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.533 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape49a07cb-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:01 np0005601977 kernel: tape49a07cb-10: entered promiscuous mode
Jan 30 04:54:01 np0005601977 NetworkManager[55565]: <info>  [1769766841.5361] manager: (tape49a07cb-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.535 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.544 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape49a07cb-10, col_values=(('external_ids', {'iface-id': '1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.546 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:01Z|00535|binding|INFO|Releasing lport 1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb from this chassis (sb_readonly=0)
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.547 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.552 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.555 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e49a07cb-1da2-4f34-9999-d9ea635349ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e49a07cb-1da2-4f34-9999-d9ea635349ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.556 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[7c425ad8-733c-4581-92d8-097d60486954]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.557 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-e49a07cb-1da2-4f34-9999-d9ea635349ba
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/e49a07cb-1da2-4f34-9999-d9ea635349ba.pid.haproxy
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID e49a07cb-1da2-4f34-9999-d9ea635349ba
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:54:01 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:01.558 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'env', 'PROCESS_TAG=haproxy-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e49a07cb-1da2-4f34-9999-d9ea635349ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.758 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766841.7577186, 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.759 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] VM Started (Lifecycle Event)#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.785 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.789 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766841.7589061, 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.789 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.809 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.813 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.832 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.861 183134 DEBUG nova.compute.manager [req-fd2ef620-d0b0-45cb-a572-5a66adfe91c7 req-a1fd7991-1916-43db-9146-e02414682c62 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.862 183134 DEBUG oslo_concurrency.lockutils [req-fd2ef620-d0b0-45cb-a572-5a66adfe91c7 req-a1fd7991-1916-43db-9146-e02414682c62 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.862 183134 DEBUG oslo_concurrency.lockutils [req-fd2ef620-d0b0-45cb-a572-5a66adfe91c7 req-a1fd7991-1916-43db-9146-e02414682c62 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.862 183134 DEBUG oslo_concurrency.lockutils [req-fd2ef620-d0b0-45cb-a572-5a66adfe91c7 req-a1fd7991-1916-43db-9146-e02414682c62 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.862 183134 DEBUG nova.compute.manager [req-fd2ef620-d0b0-45cb-a572-5a66adfe91c7 req-a1fd7991-1916-43db-9146-e02414682c62 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Processing event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.863 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.867 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766841.8659894, 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.868 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.869 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.873 183134 INFO nova.virt.libvirt.driver [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Instance spawned successfully.#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.874 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.893 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:01 np0005601977 podman[230288]: 2026-01-30 09:54:01.896333276 +0000 UTC m=+0.052034968 container create 6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.900 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.901 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.901 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.902 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.902 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.903 183134 DEBUG nova.virt.libvirt.driver [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.906 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:54:01 np0005601977 systemd[1]: Started libpod-conmon-6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4.scope.
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.940 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:54:01 np0005601977 podman[230288]: 2026-01-30 09:54:01.86567534 +0000 UTC m=+0.021377112 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.963 183134 INFO nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Took 5.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:54:01 np0005601977 nova_compute[183130]: 2026-01-30 09:54:01.963 183134 DEBUG nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:01 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:54:01 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad7c96ecad16a427a55f6d00d5eec647aa886cc7792671eee94cbf814b2e6bb3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:54:01 np0005601977 podman[230288]: 2026-01-30 09:54:01.981056938 +0000 UTC m=+0.136758650 container init 6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:54:01 np0005601977 podman[230288]: 2026-01-30 09:54:01.984835216 +0000 UTC m=+0.140536918 container start 6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 30 04:54:02 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [NOTICE]   (230307) : New worker (230309) forked
Jan 30 04:54:02 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [NOTICE]   (230307) : Loading success.
Jan 30 04:54:02 np0005601977 nova_compute[183130]: 2026-01-30 09:54:02.139 183134 INFO nova.compute.manager [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Took 6.21 seconds to build instance.#033[00m
Jan 30 04:54:02 np0005601977 nova_compute[183130]: 2026-01-30 09:54:02.160 183134 DEBUG oslo_concurrency.lockutils [None req-14294a1f-be62-425a-916c-651a5eb73e2d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:02 np0005601977 nova_compute[183130]: 2026-01-30 09:54:02.691 183134 DEBUG nova.network.neutron [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updated VIF entry in instance network info cache for port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:54:02 np0005601977 nova_compute[183130]: 2026-01-30 09:54:02.692 183134 DEBUG nova.network.neutron [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updating instance_info_cache with network_info: [{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:02 np0005601977 nova_compute[183130]: 2026-01-30 09:54:02.710 183134 DEBUG oslo_concurrency.lockutils [req-d6a253aa-b510-44dd-a941-1f1841daa583 req-cfe5bc3a-487d-433d-a4d4-c6b07b6d9a79 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.623 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.959 183134 DEBUG nova.compute.manager [req-4cc2ab84-1146-40a1-8cdb-6e7c16e24f32 req-6604b4d9-a1bf-4def-bc22-0697253a0f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.960 183134 DEBUG oslo_concurrency.lockutils [req-4cc2ab84-1146-40a1-8cdb-6e7c16e24f32 req-6604b4d9-a1bf-4def-bc22-0697253a0f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.960 183134 DEBUG oslo_concurrency.lockutils [req-4cc2ab84-1146-40a1-8cdb-6e7c16e24f32 req-6604b4d9-a1bf-4def-bc22-0697253a0f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.961 183134 DEBUG oslo_concurrency.lockutils [req-4cc2ab84-1146-40a1-8cdb-6e7c16e24f32 req-6604b4d9-a1bf-4def-bc22-0697253a0f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.961 183134 DEBUG nova.compute.manager [req-4cc2ab84-1146-40a1-8cdb-6e7c16e24f32 req-6604b4d9-a1bf-4def-bc22-0697253a0f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] No waiting events found dispatching network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:54:03 np0005601977 nova_compute[183130]: 2026-01-30 09:54:03.961 183134 WARNING nova.compute.manager [req-4cc2ab84-1146-40a1-8cdb-6e7c16e24f32 req-6604b4d9-a1bf-4def-bc22-0697253a0f31 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received unexpected event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:54:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:04Z|00536|binding|INFO|Releasing lport 1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb from this chassis (sb_readonly=0)
Jan 30 04:54:04 np0005601977 NetworkManager[55565]: <info>  [1769766844.3546] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.355 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:04 np0005601977 NetworkManager[55565]: <info>  [1769766844.3564] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 30 04:54:04 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:04Z|00537|binding|INFO|Releasing lport 1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb from this chassis (sb_readonly=0)
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.365 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.793 183134 DEBUG nova.compute.manager [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-changed-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.793 183134 DEBUG nova.compute.manager [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Refreshing instance network info cache due to event network-changed-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.793 183134 DEBUG oslo_concurrency.lockutils [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.794 183134 DEBUG oslo_concurrency.lockutils [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:04 np0005601977 nova_compute[183130]: 2026-01-30 09:54:04.794 183134 DEBUG nova.network.neutron [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Refreshing network info cache for port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:54:05 np0005601977 nova_compute[183130]: 2026-01-30 09:54:05.839 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:06 np0005601977 nova_compute[183130]: 2026-01-30 09:54:06.700 183134 DEBUG nova.network.neutron [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updated VIF entry in instance network info cache for port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:54:06 np0005601977 nova_compute[183130]: 2026-01-30 09:54:06.701 183134 DEBUG nova.network.neutron [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updating instance_info_cache with network_info: [{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:06 np0005601977 nova_compute[183130]: 2026-01-30 09:54:06.725 183134 DEBUG oslo_concurrency.lockutils [req-ca421d9a-1982-48b5-9c38-969ef2a23930 req-9f979374-87b7-4c37-9645-1869ffd5176d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:07 np0005601977 podman[230319]: 2026-01-30 09:54:07.880056436 +0000 UTC m=+0.090131927 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 30 04:54:08 np0005601977 nova_compute[183130]: 2026-01-30 09:54:08.622 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:10 np0005601977 nova_compute[183130]: 2026-01-30 09:54:10.842 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:12Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:d3:5f 10.100.0.11
Jan 30 04:54:12 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:12Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:d3:5f 10.100.0.11
Jan 30 04:54:13 np0005601977 nova_compute[183130]: 2026-01-30 09:54:13.624 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:15 np0005601977 nova_compute[183130]: 2026-01-30 09:54:15.845 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:16 np0005601977 podman[230361]: 2026-01-30 09:54:16.862779082 +0000 UTC m=+0.075130278 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:54:18 np0005601977 nova_compute[183130]: 2026-01-30 09:54:18.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:20 np0005601977 nova_compute[183130]: 2026-01-30 09:54:20.847 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.372 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.373 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.374 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.374 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.544 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.621 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.623 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.679 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.808 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.809 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5541MB free_disk=73.21274948120117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.809 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.810 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.926 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.927 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.927 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.978 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:54:21 np0005601977 nova_compute[183130]: 2026-01-30 09:54:21.995 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:54:22 np0005601977 nova_compute[183130]: 2026-01-30 09:54:22.019 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:54:22 np0005601977 nova_compute[183130]: 2026-01-30 09:54:22.020 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:23 np0005601977 nova_compute[183130]: 2026-01-30 09:54:23.629 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.352 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.352 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.369 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.435 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.435 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.443 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.443 183134 INFO nova.compute.claims [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.563 183134 DEBUG nova.compute.provider_tree [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.578 183134 DEBUG nova.scheduler.client.report [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.605 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.606 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.654 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.655 183134 DEBUG nova.network.neutron [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.671 183134 INFO nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.696 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.895 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.898 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.898 183134 INFO nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Creating image(s)#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.899 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.900 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.901 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:24 np0005601977 nova_compute[183130]: 2026-01-30 09:54:24.926 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.000 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.001 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.002 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.017 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.028 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.059 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.060 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.094 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.095 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.096 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.137 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.139 183134 DEBUG nova.virt.disk.api [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.139 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.181 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.182 183134 DEBUG nova.virt.disk.api [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.183 183134 DEBUG nova.objects.instance [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 5af7f9a7-c204-424e-9131-cf1ea6779f4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.222 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.222 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Ensure instance console log exists: /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.223 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.223 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.224 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.740 183134 DEBUG nova.policy [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:54:25 np0005601977 nova_compute[183130]: 2026-01-30 09:54:25.849 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:26 np0005601977 nova_compute[183130]: 2026-01-30 09:54:26.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:26 np0005601977 podman[230408]: 2026-01-30 09:54:26.855850939 +0000 UTC m=+0.070447925 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.openshift.expose-services=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 30 04:54:26 np0005601977 podman[230409]: 2026-01-30 09:54:26.869252562 +0000 UTC m=+0.079726660 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:54:28 np0005601977 nova_compute[183130]: 2026-01-30 09:54:28.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:28 np0005601977 nova_compute[183130]: 2026-01-30 09:54:28.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:28 np0005601977 nova_compute[183130]: 2026-01-30 09:54:28.631 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:28.810 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:54:28 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:28.811 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:54:28 np0005601977 nova_compute[183130]: 2026-01-30 09:54:28.812 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:29 np0005601977 nova_compute[183130]: 2026-01-30 09:54:29.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:29 np0005601977 nova_compute[183130]: 2026-01-30 09:54:29.594 183134 DEBUG nova.network.neutron [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Successfully created port: 873d894a-9774-4a8d-a8a2-2c2163b9b63d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:54:30 np0005601977 nova_compute[183130]: 2026-01-30 09:54:30.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:30 np0005601977 nova_compute[183130]: 2026-01-30 09:54:30.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:54:30 np0005601977 nova_compute[183130]: 2026-01-30 09:54:30.851 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.311 183134 DEBUG nova.network.neutron [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Successfully updated port: 873d894a-9774-4a8d-a8a2-2c2163b9b63d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.342 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.343 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.343 183134 DEBUG nova.network.neutron [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.420 183134 DEBUG nova.compute.manager [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-changed-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.420 183134 DEBUG nova.compute.manager [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Refreshing instance network info cache due to event network-changed-873d894a-9774-4a8d-a8a2-2c2163b9b63d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.421 183134 DEBUG oslo_concurrency.lockutils [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:54:31 np0005601977 nova_compute[183130]: 2026-01-30 09:54:31.495 183134 DEBUG nova.network.neutron [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:54:31 np0005601977 podman[230451]: 2026-01-30 09:54:31.83792337 +0000 UTC m=+0.055195989 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:54:31 np0005601977 podman[230450]: 2026-01-30 09:54:31.844790226 +0000 UTC m=+0.060955043 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.361 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.747 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.748 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.748 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:54:32 np0005601977 nova_compute[183130]: 2026-01-30 09:54:32.748 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.228 183134 DEBUG nova.network.neutron [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updating instance_info_cache with network_info: [{"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.248 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.249 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Instance network_info: |[{"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.250 183134 DEBUG oslo_concurrency.lockutils [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.251 183134 DEBUG nova.network.neutron [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Refreshing network info cache for port 873d894a-9774-4a8d-a8a2-2c2163b9b63d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.257 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Start _get_guest_xml network_info=[{"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.265 183134 WARNING nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.275 183134 DEBUG nova.virt.libvirt.host [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.276 183134 DEBUG nova.virt.libvirt.host [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.280 183134 DEBUG nova.virt.libvirt.host [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.281 183134 DEBUG nova.virt.libvirt.host [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.282 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.283 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.283 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.284 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.284 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.284 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.284 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.285 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.285 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.285 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.286 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.286 183134 DEBUG nova.virt.hardware [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.291 183134 DEBUG nova.virt.libvirt.vif [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:54:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-852311707',display_name='tempest-TestGettingAddress-server-852311707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-852311707',id=55,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOnyiM4pBndaC9BevGYouaGGhacyKKVbrIisds2Goy+xQN3CKOZrIF+3ZMgXRir1UCFbJm87LU8JWc7YwuMxfpvTYMqstqZIyw52xOelCTwdMMXaw1ygIkhytJtbuNJeQQ==',key_name='tempest-TestGettingAddress-2057519988',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-aq3pjkrt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:54:24Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=5af7f9a7-c204-424e-9131-cf1ea6779f4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.292 183134 DEBUG nova.network.os_vif_util [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.293 183134 DEBUG nova.network.os_vif_util [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.294 183134 DEBUG nova.objects.instance [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 5af7f9a7-c204-424e-9131-cf1ea6779f4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.308 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <uuid>5af7f9a7-c204-424e-9131-cf1ea6779f4c</uuid>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <name>instance-00000037</name>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-852311707</nova:name>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:54:33</nova:creationTime>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        <nova:port uuid="873d894a-9774-4a8d-a8a2-2c2163b9b63d">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe74:bd29" ipVersion="6"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe74:bd29" ipVersion="6"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <entry name="serial">5af7f9a7-c204-424e-9131-cf1ea6779f4c</entry>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <entry name="uuid">5af7f9a7-c204-424e-9131-cf1ea6779f4c</entry>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.config"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:74:bd:29"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <target dev="tap873d894a-97"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/console.log" append="off"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:54:33 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:54:33 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:54:33 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:54:33 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.309 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Preparing to wait for external event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.310 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.311 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.312 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.313 183134 DEBUG nova.virt.libvirt.vif [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:54:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-852311707',display_name='tempest-TestGettingAddress-server-852311707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-852311707',id=55,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOnyiM4pBndaC9BevGYouaGGhacyKKVbrIisds2Goy+xQN3CKOZrIF+3ZMgXRir1UCFbJm87LU8JWc7YwuMxfpvTYMqstqZIyw52xOelCTwdMMXaw1ygIkhytJtbuNJeQQ==',key_name='tempest-TestGettingAddress-2057519988',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-aq3pjkrt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:54:24Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=5af7f9a7-c204-424e-9131-cf1ea6779f4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.313 183134 DEBUG nova.network.os_vif_util [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.314 183134 DEBUG nova.network.os_vif_util [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.314 183134 DEBUG os_vif [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.315 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.315 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.316 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.319 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.319 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap873d894a-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.320 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap873d894a-97, col_values=(('external_ids', {'iface-id': '873d894a-9774-4a8d-a8a2-2c2163b9b63d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:bd:29', 'vm-uuid': '5af7f9a7-c204-424e-9131-cf1ea6779f4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:33 np0005601977 NetworkManager[55565]: <info>  [1769766873.3227] manager: (tap873d894a-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.322 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.323 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.330 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.331 183134 INFO os_vif [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97')#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.378 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.379 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.379 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:74:bd:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.379 183134 INFO nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Using config drive#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.632 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.881 183134 INFO nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Creating config drive at /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.config#033[00m
Jan 30 04:54:33 np0005601977 nova_compute[183130]: 2026-01-30 09:54:33.885 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqxemmpx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.013 183134 DEBUG oslo_concurrency.processutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqxemmpx" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:54:34 np0005601977 kernel: tap873d894a-97: entered promiscuous mode
Jan 30 04:54:34 np0005601977 NetworkManager[55565]: <info>  [1769766874.0678] manager: (tap873d894a-97): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Jan 30 04:54:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:34Z|00538|binding|INFO|Claiming lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d for this chassis.
Jan 30 04:54:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:34Z|00539|binding|INFO|873d894a-9774-4a8d-a8a2-2c2163b9b63d: Claiming fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.070 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:34Z|00540|binding|INFO|Setting lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d ovn-installed in OVS
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.077 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:34 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:34Z|00541|binding|INFO|Setting lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d up in Southbound
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.079 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.082 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], port_security=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe74:bd29/64 2001:db8::f816:3eff:fe74:bd29/64', 'neutron:device_id': '5af7f9a7-c204-424e-9131-cf1ea6779f4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '748616c7-5270-4b3a-b8ec-d02da066836a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=873d894a-9774-4a8d-a8a2-2c2163b9b63d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.085 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 873d894a-9774-4a8d-a8a2-2c2163b9b63d in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba bound to our chassis#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.089 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e49a07cb-1da2-4f34-9999-d9ea635349ba#033[00m
Jan 30 04:54:34 np0005601977 systemd-machined[154431]: New machine qemu-44-instance-00000037.
Jan 30 04:54:34 np0005601977 systemd-udevd[230511]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.104 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c1bb1ad8-150f-48e5-9cf8-1d8166898e52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:34 np0005601977 NetworkManager[55565]: <info>  [1769766874.1093] device (tap873d894a-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:54:34 np0005601977 NetworkManager[55565]: <info>  [1769766874.1101] device (tap873d894a-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:54:34 np0005601977 systemd[1]: Started Virtual Machine qemu-44-instance-00000037.
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.125 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf7bc3f-f301-4d64-bc72-48d6791a1a3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.128 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[e7607b41-5ce2-4a0d-bd8a-6ea77d7a9916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.148 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[aa63e221-f215-41b7-8ec7-efa642de4b70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.161 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[be414a5f-a6e2-42fe-9cff-ade0e0c61c72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape49a07cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:4d:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 24, 'tx_packets': 5, 'rx_bytes': 2000, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 24, 'tx_packets': 5, 'rx_bytes': 2000, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537277, 'reachable_time': 26772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 22, 'inoctets': 1608, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 22, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1608, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 22, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230523, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.171 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[985751f0-798d-4bdb-b90a-e6bf7272cc62]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537284, 'tstamp': 537284}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230524, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537286, 'tstamp': 537286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230524, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.172 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape49a07cb-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.174 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.176 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.176 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape49a07cb-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.177 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.177 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape49a07cb-10, col_values=(('external_ids', {'iface-id': '1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.177 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.365 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766874.3647292, 5af7f9a7-c204-424e-9131-cf1ea6779f4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.366 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] VM Started (Lifecycle Event)#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.391 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.396 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766874.3648815, 5af7f9a7-c204-424e-9131-cf1ea6779f4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.397 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.417 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.421 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.446 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:54:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:34.814 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.933 183134 DEBUG nova.compute.manager [req-f90a0399-83f5-4dca-b839-6bc0f8f80296 req-fdfdd33a-cf58-4c25-a8e8-b03b331844fa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.934 183134 DEBUG oslo_concurrency.lockutils [req-f90a0399-83f5-4dca-b839-6bc0f8f80296 req-fdfdd33a-cf58-4c25-a8e8-b03b331844fa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.934 183134 DEBUG oslo_concurrency.lockutils [req-f90a0399-83f5-4dca-b839-6bc0f8f80296 req-fdfdd33a-cf58-4c25-a8e8-b03b331844fa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.935 183134 DEBUG oslo_concurrency.lockutils [req-f90a0399-83f5-4dca-b839-6bc0f8f80296 req-fdfdd33a-cf58-4c25-a8e8-b03b331844fa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.935 183134 DEBUG nova.compute.manager [req-f90a0399-83f5-4dca-b839-6bc0f8f80296 req-fdfdd33a-cf58-4c25-a8e8-b03b331844fa dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Processing event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.942 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.947 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766874.9464762, 5af7f9a7-c204-424e-9131-cf1ea6779f4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.947 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.951 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.955 183134 INFO nova.virt.libvirt.driver [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Instance spawned successfully.#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.957 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.974 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.979 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.991 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.992 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.993 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.994 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.994 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:34 np0005601977 nova_compute[183130]: 2026-01-30 09:54:34.995 183134 DEBUG nova.virt.libvirt.driver [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:54:35 np0005601977 nova_compute[183130]: 2026-01-30 09:54:35.004 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:54:35 np0005601977 nova_compute[183130]: 2026-01-30 09:54:35.087 183134 INFO nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Took 10.19 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:54:35 np0005601977 nova_compute[183130]: 2026-01-30 09:54:35.088 183134 DEBUG nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:54:35 np0005601977 nova_compute[183130]: 2026-01-30 09:54:35.158 183134 INFO nova.compute.manager [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Took 10.74 seconds to build instance.#033[00m
Jan 30 04:54:35 np0005601977 nova_compute[183130]: 2026-01-30 09:54:35.179 183134 DEBUG oslo_concurrency.lockutils [None req-357bbb7c-e8e1-434c-87d4-815af79a353d 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:35 np0005601977 nova_compute[183130]: 2026-01-30 09:54:35.990 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updating instance_info_cache with network_info: [{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:36 np0005601977 nova_compute[183130]: 2026-01-30 09:54:36.005 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:36 np0005601977 nova_compute[183130]: 2026-01-30 09:54:36.005 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:54:36 np0005601977 nova_compute[183130]: 2026-01-30 09:54:36.758 183134 DEBUG nova.network.neutron [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updated VIF entry in instance network info cache for port 873d894a-9774-4a8d-a8a2-2c2163b9b63d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:54:36 np0005601977 nova_compute[183130]: 2026-01-30 09:54:36.758 183134 DEBUG nova.network.neutron [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updating instance_info_cache with network_info: [{"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:36 np0005601977 nova_compute[183130]: 2026-01-30 09:54:36.775 183134 DEBUG oslo_concurrency.lockutils [req-a84fb742-9433-4f5e-a56e-7cd2d5898c1c req-8e217594-2f81-48b7-a170-a72ada48b202 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:37 np0005601977 nova_compute[183130]: 2026-01-30 09:54:37.021 183134 DEBUG nova.compute.manager [req-fb20ea0a-7d6f-4999-bff0-f7c352a7badc req-5b1a5fa7-1422-4a12-bd10-bf19c0019e44 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:37 np0005601977 nova_compute[183130]: 2026-01-30 09:54:37.022 183134 DEBUG oslo_concurrency.lockutils [req-fb20ea0a-7d6f-4999-bff0-f7c352a7badc req-5b1a5fa7-1422-4a12-bd10-bf19c0019e44 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:37 np0005601977 nova_compute[183130]: 2026-01-30 09:54:37.022 183134 DEBUG oslo_concurrency.lockutils [req-fb20ea0a-7d6f-4999-bff0-f7c352a7badc req-5b1a5fa7-1422-4a12-bd10-bf19c0019e44 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:37 np0005601977 nova_compute[183130]: 2026-01-30 09:54:37.022 183134 DEBUG oslo_concurrency.lockutils [req-fb20ea0a-7d6f-4999-bff0-f7c352a7badc req-5b1a5fa7-1422-4a12-bd10-bf19c0019e44 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:37 np0005601977 nova_compute[183130]: 2026-01-30 09:54:37.023 183134 DEBUG nova.compute.manager [req-fb20ea0a-7d6f-4999-bff0-f7c352a7badc req-5b1a5fa7-1422-4a12-bd10-bf19c0019e44 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] No waiting events found dispatching network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:54:37 np0005601977 nova_compute[183130]: 2026-01-30 09:54:37.023 183134 WARNING nova.compute.manager [req-fb20ea0a-7d6f-4999-bff0-f7c352a7badc req-5b1a5fa7-1422-4a12-bd10-bf19c0019e44 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received unexpected event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d for instance with vm_state active and task_state None.#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.322 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.634 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.860 183134 DEBUG nova.compute.manager [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-changed-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.860 183134 DEBUG nova.compute.manager [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Refreshing instance network info cache due to event network-changed-873d894a-9774-4a8d-a8a2-2c2163b9b63d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.860 183134 DEBUG oslo_concurrency.lockutils [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.860 183134 DEBUG oslo_concurrency.lockutils [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:38 np0005601977 nova_compute[183130]: 2026-01-30 09:54:38.860 183134 DEBUG nova.network.neutron [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Refreshing network info cache for port 873d894a-9774-4a8d-a8a2-2c2163b9b63d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:54:38 np0005601977 podman[230533]: 2026-01-30 09:54:38.928468728 +0000 UTC m=+0.135864445 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:54:39 np0005601977 nova_compute[183130]: 2026-01-30 09:54:39.000 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:54:40 np0005601977 nova_compute[183130]: 2026-01-30 09:54:40.773 183134 DEBUG nova.network.neutron [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updated VIF entry in instance network info cache for port 873d894a-9774-4a8d-a8a2-2c2163b9b63d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:54:40 np0005601977 nova_compute[183130]: 2026-01-30 09:54:40.773 183134 DEBUG nova.network.neutron [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updating instance_info_cache with network_info: [{"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:40 np0005601977 nova_compute[183130]: 2026-01-30 09:54:40.800 183134 DEBUG oslo_concurrency.lockutils [req-e46bc16f-4228-46f1-94c3-4855f92f9081 req-736f38ca-8bd8-40dc-b068-b9d02f9e5156 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:43 np0005601977 nova_compute[183130]: 2026-01-30 09:54:43.325 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:43 np0005601977 nova_compute[183130]: 2026-01-30 09:54:43.637 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:47Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:bd:29 10.100.0.5
Jan 30 04:54:47 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:47Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:bd:29 10.100.0.5
Jan 30 04:54:47 np0005601977 podman[230570]: 2026-01-30 09:54:47.856722875 +0000 UTC m=+0.072389040 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:54:48 np0005601977 nova_compute[183130]: 2026-01-30 09:54:48.327 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:48 np0005601977 nova_compute[183130]: 2026-01-30 09:54:48.640 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:53 np0005601977 nova_compute[183130]: 2026-01-30 09:54:53.329 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:53 np0005601977 nova_compute[183130]: 2026-01-30 09:54:53.642 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.347 183134 DEBUG nova.compute.manager [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-changed-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.347 183134 DEBUG nova.compute.manager [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Refreshing instance network info cache due to event network-changed-873d894a-9774-4a8d-a8a2-2c2163b9b63d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.348 183134 DEBUG oslo_concurrency.lockutils [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.348 183134 DEBUG oslo_concurrency.lockutils [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.348 183134 DEBUG nova.network.neutron [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Refreshing network info cache for port 873d894a-9774-4a8d-a8a2-2c2163b9b63d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.410 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.410 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.411 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.415 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.416 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.416 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.416 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.417 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.418 183134 INFO nova.compute.manager [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Terminating instance#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.419 183134 DEBUG nova.compute.manager [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:54:57 np0005601977 kernel: tap873d894a-97 (unregistering): left promiscuous mode
Jan 30 04:54:57 np0005601977 NetworkManager[55565]: <info>  [1769766897.4489] device (tap873d894a-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:54:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:57Z|00542|binding|INFO|Releasing lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d from this chassis (sb_readonly=0)
Jan 30 04:54:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:57Z|00543|binding|INFO|Setting lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d down in Southbound
Jan 30 04:54:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:57Z|00544|binding|INFO|Removing iface tap873d894a-97 ovn-installed in OVS
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.504 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.512 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], port_security=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe74:bd29/64 2001:db8::f816:3eff:fe74:bd29/64', 'neutron:device_id': '5af7f9a7-c204-424e-9131-cf1ea6779f4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '748616c7-5270-4b3a-b8ec-d02da066836a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=873d894a-9774-4a8d-a8a2-2c2163b9b63d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.515 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 873d894a-9774-4a8d-a8a2-2c2163b9b63d in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba unbound from our chassis#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.517 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.520 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e49a07cb-1da2-4f34-9999-d9ea635349ba#033[00m
Jan 30 04:54:57 np0005601977 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 30 04:54:57 np0005601977 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000037.scope: Consumed 12.418s CPU time.
Jan 30 04:54:57 np0005601977 systemd-machined[154431]: Machine qemu-44-instance-00000037 terminated.
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.538 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[30a81dbf-bce3-419e-868f-45083774cf41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 podman[230597]: 2026-01-30 09:54:57.555873988 +0000 UTC m=+0.083813847 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 30 04:54:57 np0005601977 podman[230600]: 2026-01-30 09:54:57.557130414 +0000 UTC m=+0.085763493 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.564 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a06a6722-3e1f-47c4-b967-b3beb99332e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.567 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[b68ba102-31d2-4856-8774-6cdf756cf21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.586 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2a29e72d-8e05-437c-a5a8-672280acd122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.598 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fc7eecf9-e286-408c-9584-a249653965d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape49a07cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:4d:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 42, 'tx_packets': 7, 'rx_bytes': 3468, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 42, 'tx_packets': 7, 'rx_bytes': 3468, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537277, 'reachable_time': 26772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 38, 'inoctets': 2768, 'indelivers': 13, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 38, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2768, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 38, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 13, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230648, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.612 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2ddd63ef-b11f-4307-9d29-7d558186d316]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537284, 'tstamp': 537284}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230649, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537286, 'tstamp': 537286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230649, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.614 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape49a07cb-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.616 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.619 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.620 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape49a07cb-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.621 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.621 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape49a07cb-10, col_values=(('external_ids', {'iface-id': '1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.622 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:57 np0005601977 kernel: tap873d894a-97: entered promiscuous mode
Jan 30 04:54:57 np0005601977 kernel: tap873d894a-97 (unregistering): left promiscuous mode
Jan 30 04:54:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:57Z|00545|binding|INFO|Claiming lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d for this chassis.
Jan 30 04:54:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:57Z|00546|binding|INFO|873d894a-9774-4a8d-a8a2-2c2163b9b63d: Claiming fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.642 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_controller[95460]: 2026-01-30T09:54:57Z|00547|binding|INFO|Releasing lport 873d894a-9774-4a8d-a8a2-2c2163b9b63d from this chassis (sb_readonly=0)
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.651 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], port_security=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe74:bd29/64 2001:db8::f816:3eff:fe74:bd29/64', 'neutron:device_id': '5af7f9a7-c204-424e-9131-cf1ea6779f4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '748616c7-5270-4b3a-b8ec-d02da066836a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=873d894a-9774-4a8d-a8a2-2c2163b9b63d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.651 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.653 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 873d894a-9774-4a8d-a8a2-2c2163b9b63d in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba bound to our chassis#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.654 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e49a07cb-1da2-4f34-9999-d9ea635349ba#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.658 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], port_security=['fa:16:3e:74:bd:29 10.100.0.5 2001:db8:0:1:f816:3eff:fe74:bd29 2001:db8::f816:3eff:fe74:bd29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe74:bd29/64 2001:db8::f816:3eff:fe74:bd29/64', 'neutron:device_id': '5af7f9a7-c204-424e-9131-cf1ea6779f4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '748616c7-5270-4b3a-b8ec-d02da066836a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=873d894a-9774-4a8d-a8a2-2c2163b9b63d) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.667 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[54bbd03a-86cc-4faf-952f-9199c5d41b68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.683 183134 INFO nova.virt.libvirt.driver [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Instance destroyed successfully.#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.684 183134 DEBUG nova.objects.instance [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid 5af7f9a7-c204-424e-9131-cf1ea6779f4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.689 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5e9321-9f3e-4f91-a247-d950680f5d6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.691 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[339c1ec1-efc3-4619-be29-0964b54d15a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.697 183134 DEBUG nova.virt.libvirt.vif [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:54:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-852311707',display_name='tempest-TestGettingAddress-server-852311707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-852311707',id=55,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOnyiM4pBndaC9BevGYouaGGhacyKKVbrIisds2Goy+xQN3CKOZrIF+3ZMgXRir1UCFbJm87LU8JWc7YwuMxfpvTYMqstqZIyw52xOelCTwdMMXaw1ygIkhytJtbuNJeQQ==',key_name='tempest-TestGettingAddress-2057519988',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:54:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-aq3pjkrt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:54:35Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=5af7f9a7-c204-424e-9131-cf1ea6779f4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.698 183134 DEBUG nova.network.os_vif_util [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.702 183134 DEBUG nova.network.os_vif_util [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.703 183134 DEBUG os_vif [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.705 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.706 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap873d894a-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.708 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.710 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.712 183134 INFO os_vif [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:bd:29,bridge_name='br-int',has_traffic_filtering=True,id=873d894a-9774-4a8d-a8a2-2c2163b9b63d,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap873d894a-97')#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.713 183134 INFO nova.virt.libvirt.driver [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Deleting instance files /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c_del#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.713 183134 INFO nova.virt.libvirt.driver [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Deletion of /var/lib/nova/instances/5af7f9a7-c204-424e-9131-cf1ea6779f4c_del complete#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.716 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[2409dd42-27a7-4174-8109-ee3d21ddfe7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.730 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e6b80d-3843-4798-bc67-e4bfaa640d50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape49a07cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:4d:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 42, 'tx_packets': 9, 'rx_bytes': 3468, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 42, 'tx_packets': 9, 'rx_bytes': 3468, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537277, 'reachable_time': 26772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 38, 'inoctets': 2768, 'indelivers': 13, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 38, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2768, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 38, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 13, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230670, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.748 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dff48ca0-75d3-47cc-a0d4-dccc4e815567]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537284, 'tstamp': 537284}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230671, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537286, 'tstamp': 537286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230671, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.750 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape49a07cb-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.752 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.753 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.753 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape49a07cb-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.753 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.754 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape49a07cb-10, col_values=(('external_ids', {'iface-id': '1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.754 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.755 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 873d894a-9774-4a8d-a8a2-2c2163b9b63d in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba unbound from our chassis#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.756 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e49a07cb-1da2-4f34-9999-d9ea635349ba#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.761 183134 INFO nova.compute.manager [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.761 183134 DEBUG oslo.service.loopingcall [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.762 183134 DEBUG nova.compute.manager [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.762 183134 DEBUG nova.network.neutron [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.770 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[58c3c298-1622-425f-80c7-142e75dfe30c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.793 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[ba06fc39-871a-4dbe-80cc-aefbf3b5cc02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.796 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[39a0fe35-bbd2-41a3-8608-82c8ed0e2801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.826 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[61ec54d2-dba7-4877-905f-57c209b37724]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.846 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[c64cbfcb-7d7f-4921-9ecf-98315c6dbd56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape49a07cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:4d:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 42, 'tx_packets': 11, 'rx_bytes': 3468, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 42, 'tx_packets': 11, 'rx_bytes': 3468, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537277, 'reachable_time': 26772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 38, 'inoctets': 2768, 'indelivers': 13, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 38, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2768, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 38, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 13, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230677, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.862 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b33bab-1483-4cfa-b0c4-8368ccd4f846]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537284, 'tstamp': 537284}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230678, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape49a07cb-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537286, 'tstamp': 537286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230678, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.864 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape49a07cb-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.866 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 nova_compute[183130]: 2026-01-30 09:54:57.867 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.868 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape49a07cb-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.869 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.870 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape49a07cb-10, col_values=(('external_ids', {'iface-id': '1e30e9cc-f26a-4560-a44d-b4dc76d3c0eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:54:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:54:57.870 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.667 183134 DEBUG nova.network.neutron [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.686 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.689 183134 INFO nova.compute.manager [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Took 0.93 seconds to deallocate network for instance.#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.747 183134 DEBUG nova.compute.manager [req-5a221fc4-b7ce-454b-8999-4636b35fe90d req-7624d829-db1b-43ce-b96a-32aa5985e58c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-vif-deleted-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.753 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.753 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.825 183134 DEBUG nova.compute.provider_tree [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.847 183134 DEBUG nova.scheduler.client.report [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.873 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.910 183134 INFO nova.scheduler.client.report [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 5af7f9a7-c204-424e-9131-cf1ea6779f4c#033[00m
Jan 30 04:54:58 np0005601977 nova_compute[183130]: 2026-01-30 09:54:58.987 183134 DEBUG oslo_concurrency.lockutils [None req-238e1ac8-729f-493e-bf36-1cd6f2dfb308 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.335 183134 DEBUG nova.network.neutron [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updated VIF entry in instance network info cache for port 873d894a-9774-4a8d-a8a2-2c2163b9b63d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.336 183134 DEBUG nova.network.neutron [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Updating instance_info_cache with network_info: [{"id": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "address": "fa:16:3e:74:bd:29", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:bd29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap873d894a-97", "ovs_interfaceid": "873d894a-9774-4a8d-a8a2-2c2163b9b63d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.364 183134 DEBUG oslo_concurrency.lockutils [req-d1dc409f-8527-40bb-83f4-1aa91365400e req-769532ac-58cc-4020-980c-ca22aa7f5f41 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-5af7f9a7-c204-424e-9131-cf1ea6779f4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.439 183134 DEBUG nova.compute.manager [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-vif-unplugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.440 183134 DEBUG oslo_concurrency.lockutils [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.440 183134 DEBUG oslo_concurrency.lockutils [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.441 183134 DEBUG oslo_concurrency.lockutils [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.441 183134 DEBUG nova.compute.manager [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] No waiting events found dispatching network-vif-unplugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.442 183134 WARNING nova.compute.manager [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received unexpected event network-vif-unplugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.442 183134 DEBUG nova.compute.manager [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.443 183134 DEBUG oslo_concurrency.lockutils [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.443 183134 DEBUG oslo_concurrency.lockutils [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.443 183134 DEBUG oslo_concurrency.lockutils [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "5af7f9a7-c204-424e-9131-cf1ea6779f4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.444 183134 DEBUG nova.compute.manager [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] No waiting events found dispatching network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:54:59 np0005601977 nova_compute[183130]: 2026-01-30 09:54:59.444 183134 WARNING nova.compute.manager [req-a1664664-aa2e-4d3d-8654-e6910c69153e req-1f6e28bf-dd9f-4fbf-8361-334f433f5904 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Received unexpected event network-vif-plugged-873d894a-9774-4a8d-a8a2-2c2163b9b63d for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.304 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.305 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.306 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.307 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.307 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.309 183134 INFO nova.compute.manager [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Terminating instance#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.311 183134 DEBUG nova.compute.manager [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:55:00 np0005601977 kernel: tap12c93e5f-3f (unregistering): left promiscuous mode
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.336 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 NetworkManager[55565]: <info>  [1769766900.3380] device (tap12c93e5f-3f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:55:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:00Z|00548|binding|INFO|Releasing lport 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 from this chassis (sb_readonly=0)
Jan 30 04:55:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:00Z|00549|binding|INFO|Setting lport 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 down in Southbound
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.343 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:00Z|00550|binding|INFO|Removing iface tap12c93e5f-3f ovn-installed in OVS
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.345 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.349 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.356 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:d3:5f 10.100.0.11 2001:db8:0:1:f816:3eff:fece:d35f 2001:db8::f816:3eff:fece:d35f'], port_security=['fa:16:3e:ce:d3:5f 10.100.0.11 2001:db8:0:1:f816:3eff:fece:d35f 2001:db8::f816:3eff:fece:d35f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fece:d35f/64 2001:db8::f816:3eff:fece:d35f/64', 'neutron:device_id': '93393ec0-3300-4ba6-a539-ffaaa32ffdc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '748616c7-5270-4b3a-b8ec-d02da066836a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=592c9aa8-7a7c-408b-b30d-7e624e483665, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.358 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 in datapath e49a07cb-1da2-4f34-9999-d9ea635349ba unbound from our chassis#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.359 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e49a07cb-1da2-4f34-9999-d9ea635349ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.360 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[ac15c128-3e25-4c20-af40-a3a02c01cd4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.360 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba namespace which is not needed anymore#033[00m
Jan 30 04:55:00 np0005601977 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000036.scope: Deactivated successfully.
Jan 30 04:55:00 np0005601977 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000036.scope: Consumed 14.415s CPU time.
Jan 30 04:55:00 np0005601977 systemd-machined[154431]: Machine qemu-43-instance-00000036 terminated.
Jan 30 04:55:00 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [NOTICE]   (230307) : haproxy version is 2.8.14-c23fe91
Jan 30 04:55:00 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [NOTICE]   (230307) : path to executable is /usr/sbin/haproxy
Jan 30 04:55:00 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [WARNING]  (230307) : Exiting Master process...
Jan 30 04:55:00 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [ALERT]    (230307) : Current worker (230309) exited with code 143 (Terminated)
Jan 30 04:55:00 np0005601977 neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba[230303]: [WARNING]  (230307) : All workers exited. Exiting... (0)
Jan 30 04:55:00 np0005601977 systemd[1]: libpod-6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4.scope: Deactivated successfully.
Jan 30 04:55:00 np0005601977 podman[230701]: 2026-01-30 09:55:00.490891824 +0000 UTC m=+0.045287465 container died 6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:55:00 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4-userdata-shm.mount: Deactivated successfully.
Jan 30 04:55:00 np0005601977 systemd[1]: var-lib-containers-storage-overlay-ad7c96ecad16a427a55f6d00d5eec647aa886cc7792671eee94cbf814b2e6bb3-merged.mount: Deactivated successfully.
Jan 30 04:55:00 np0005601977 podman[230701]: 2026-01-30 09:55:00.533114881 +0000 UTC m=+0.087510512 container cleanup 6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:55:00 np0005601977 systemd[1]: libpod-conmon-6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4.scope: Deactivated successfully.
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.567 183134 INFO nova.virt.libvirt.driver [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Instance destroyed successfully.#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.567 183134 DEBUG nova.objects.instance [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.581 183134 DEBUG nova.virt.libvirt.vif [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:53:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1787753693',display_name='tempest-TestGettingAddress-server-1787753693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1787753693',id=54,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOnyiM4pBndaC9BevGYouaGGhacyKKVbrIisds2Goy+xQN3CKOZrIF+3ZMgXRir1UCFbJm87LU8JWc7YwuMxfpvTYMqstqZIyw52xOelCTwdMMXaw1ygIkhytJtbuNJeQQ==',key_name='tempest-TestGettingAddress-2057519988',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:54:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-gijkf33n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:54:02Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=93393ec0-3300-4ba6-a539-ffaaa32ffdc2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.582 183134 DEBUG nova.network.os_vif_util [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.583 183134 DEBUG nova.network.os_vif_util [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.583 183134 DEBUG os_vif [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.584 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.585 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12c93e5f-3f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.586 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.588 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.590 183134 INFO os_vif [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:d3:5f,bridge_name='br-int',has_traffic_filtering=True,id=12c93e5f-3f77-4417-a6c0-a9ddd740d0f9,network=Network(e49a07cb-1da2-4f34-9999-d9ea635349ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12c93e5f-3f')#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.590 183134 INFO nova.virt.libvirt.driver [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Deleting instance files /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2_del#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.590 183134 INFO nova.virt.libvirt.driver [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Deletion of /var/lib/nova/instances/93393ec0-3300-4ba6-a539-ffaaa32ffdc2_del complete#033[00m
Jan 30 04:55:00 np0005601977 podman[230736]: 2026-01-30 09:55:00.591392407 +0000 UTC m=+0.039457049 container remove 6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.594 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe2c288-595f-472f-9725-b98be690226d]: (4, ('Fri Jan 30 09:55:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba (6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4)\n6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4\nFri Jan 30 09:55:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba (6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4)\n6b7acb62b1fb0ecb725322c591ca66442f84e29ec1c6c8b944b99b90ca969cd4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.595 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a61a82f4-ce46-4c00-8081-7426b38d81b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.596 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape49a07cb-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.597 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 kernel: tape49a07cb-10: left promiscuous mode
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.602 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.604 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a9765ec2-1e2d-4e55-a640-da6ba97bd8fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.623 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[848564cc-1577-4963-be4c-ad12962a0cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.625 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[012da0d7-cc5f-4c1b-9139-6eb3879e6d8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.634 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[85aab729-3164-4079-9e84-712944b114d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537272, 'reachable_time': 33546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230764, 'error': None, 'target': 'ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 systemd[1]: run-netns-ovnmeta\x2de49a07cb\x2d1da2\x2d4f34\x2d9999\x2dd9ea635349ba.mount: Deactivated successfully.
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.638 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e49a07cb-1da2-4f34-9999-d9ea635349ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:55:00 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:00.638 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[4552f405-4e10-4cbb-b55c-77de7cd7fda0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.667 183134 INFO nova.compute.manager [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.667 183134 DEBUG oslo.service.loopingcall [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.668 183134 DEBUG nova.compute.manager [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.668 183134 DEBUG nova.network.neutron [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.864 183134 DEBUG nova.compute.manager [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-changed-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.865 183134 DEBUG nova.compute.manager [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Refreshing instance network info cache due to event network-changed-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.865 183134 DEBUG oslo_concurrency.lockutils [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.866 183134 DEBUG oslo_concurrency.lockutils [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:55:00 np0005601977 nova_compute[183130]: 2026-01-30 09:55:00.866 183134 DEBUG nova.network.neutron [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Refreshing network info cache for port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.550 183134 DEBUG nova.compute.manager [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-vif-unplugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.551 183134 DEBUG oslo_concurrency.lockutils [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.551 183134 DEBUG oslo_concurrency.lockutils [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.551 183134 DEBUG oslo_concurrency.lockutils [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.552 183134 DEBUG nova.compute.manager [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] No waiting events found dispatching network-vif-unplugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.552 183134 DEBUG nova.compute.manager [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-vif-unplugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.552 183134 DEBUG nova.compute.manager [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.552 183134 DEBUG oslo_concurrency.lockutils [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.553 183134 DEBUG oslo_concurrency.lockutils [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.553 183134 DEBUG oslo_concurrency.lockutils [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.553 183134 DEBUG nova.compute.manager [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] No waiting events found dispatching network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.553 183134 WARNING nova.compute.manager [req-10360dbb-167d-489e-8451-5e5d0dd41b90 req-ecad191b-2f78-4629-a00d-0eb8c51f8142 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received unexpected event network-vif-plugged-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 for instance with vm_state active and task_state deleting.#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.684 183134 DEBUG nova.network.neutron [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.705 183134 INFO nova.compute.manager [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Took 1.04 seconds to deallocate network for instance.#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.755 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.756 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.817 183134 DEBUG nova.compute.provider_tree [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.832 183134 DEBUG nova.scheduler.client.report [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.853 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.887 183134 INFO nova.scheduler.client.report [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 93393ec0-3300-4ba6-a539-ffaaa32ffdc2#033[00m
Jan 30 04:55:01 np0005601977 nova_compute[183130]: 2026-01-30 09:55:01.964 183134 DEBUG oslo_concurrency.lockutils [None req-c2831049-26b6-4755-bba3-b717a54d417a 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "93393ec0-3300-4ba6-a539-ffaaa32ffdc2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:02 np0005601977 nova_compute[183130]: 2026-01-30 09:55:02.723 183134 DEBUG nova.network.neutron [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updated VIF entry in instance network info cache for port 12c93e5f-3f77-4417-a6c0-a9ddd740d0f9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:55:02 np0005601977 nova_compute[183130]: 2026-01-30 09:55:02.724 183134 DEBUG nova.network.neutron [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Updating instance_info_cache with network_info: [{"id": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "address": "fa:16:3e:ce:d3:5f", "network": {"id": "e49a07cb-1da2-4f34-9999-d9ea635349ba", "bridge": "br-int", "label": "tempest-network-smoke--293234321", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fece:d35f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12c93e5f-3f", "ovs_interfaceid": "12c93e5f-3f77-4417-a6c0-a9ddd740d0f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:55:02 np0005601977 nova_compute[183130]: 2026-01-30 09:55:02.762 183134 DEBUG oslo_concurrency.lockutils [req-0986ee6c-3664-458a-909d-7570712cfab7 req-7d9e229c-a76a-4f4f-a179-b48be39b8ed1 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-93393ec0-3300-4ba6-a539-ffaaa32ffdc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:55:02 np0005601977 podman[230766]: 2026-01-30 09:55:02.84585922 +0000 UTC m=+0.059823041 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:55:02 np0005601977 podman[230765]: 2026-01-30 09:55:02.846469467 +0000 UTC m=+0.059267145 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:55:02 np0005601977 nova_compute[183130]: 2026-01-30 09:55:02.956 183134 DEBUG nova.compute.manager [req-53460b79-37b5-48fa-9ff8-093b4d8bed8f req-dd953c0e-bc89-4cf1-86b3-d4b548643d1f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Received event network-vif-deleted-12c93e5f-3f77-4417-a6c0-a9ddd740d0f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:03 np0005601977 nova_compute[183130]: 2026-01-30 09:55:03.689 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:05 np0005601977 nova_compute[183130]: 2026-01-30 09:55:05.587 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:08 np0005601977 nova_compute[183130]: 2026-01-30 09:55:08.732 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:09 np0005601977 nova_compute[183130]: 2026-01-30 09:55:09.275 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:09 np0005601977 nova_compute[183130]: 2026-01-30 09:55:09.304 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:09 np0005601977 podman[230807]: 2026-01-30 09:55:09.927631427 +0000 UTC m=+0.133197938 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:55:10 np0005601977 nova_compute[183130]: 2026-01-30 09:55:10.589 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:12 np0005601977 nova_compute[183130]: 2026-01-30 09:55:12.681 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766897.679989, 5af7f9a7-c204-424e-9131-cf1ea6779f4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:55:12 np0005601977 nova_compute[183130]: 2026-01-30 09:55:12.681 183134 INFO nova.compute.manager [-] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:55:12 np0005601977 nova_compute[183130]: 2026-01-30 09:55:12.816 183134 DEBUG nova.compute.manager [None req-a56b407f-051d-4955-b6d6-fda57574936f - - - - - -] [instance: 5af7f9a7-c204-424e-9131-cf1ea6779f4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:55:13 np0005601977 nova_compute[183130]: 2026-01-30 09:55:13.734 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:15 np0005601977 nova_compute[183130]: 2026-01-30 09:55:15.565 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766900.5630918, 93393ec0-3300-4ba6-a539-ffaaa32ffdc2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:55:15 np0005601977 nova_compute[183130]: 2026-01-30 09:55:15.565 183134 INFO nova.compute.manager [-] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:55:15 np0005601977 nova_compute[183130]: 2026-01-30 09:55:15.587 183134 DEBUG nova.compute.manager [None req-c8c7eefb-ac5a-4b83-8475-86d3889ea490 - - - - - -] [instance: 93393ec0-3300-4ba6-a539-ffaaa32ffdc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:55:15 np0005601977 nova_compute[183130]: 2026-01-30 09:55:15.591 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:18 np0005601977 nova_compute[183130]: 2026-01-30 09:55:18.736 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:18 np0005601977 podman[230834]: 2026-01-30 09:55:18.840188346 +0000 UTC m=+0.058049140 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:55:20 np0005601977 nova_compute[183130]: 2026-01-30 09:55:20.593 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.373 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.374 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.375 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.375 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.590 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.591 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.2417984008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.592 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.592 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.660 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.661 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.680 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.692 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.711 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:55:21 np0005601977 nova_compute[183130]: 2026-01-30 09:55:21.711 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:23 np0005601977 nova_compute[183130]: 2026-01-30 09:55:23.738 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:25 np0005601977 nova_compute[183130]: 2026-01-30 09:55:25.596 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:25 np0005601977 nova_compute[183130]: 2026-01-30 09:55:25.711 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:26 np0005601977 nova_compute[183130]: 2026-01-30 09:55:26.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:27 np0005601977 podman[230861]: 2026-01-30 09:55:27.856197913 +0000 UTC m=+0.060713237 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:55:27 np0005601977 podman[230860]: 2026-01-30 09:55:27.856879192 +0000 UTC m=+0.067039947 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, vendor=Red Hat, Inc.)
Jan 30 04:55:28 np0005601977 nova_compute[183130]: 2026-01-30 09:55:28.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:28 np0005601977 nova_compute[183130]: 2026-01-30 09:55:28.740 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:29 np0005601977 nova_compute[183130]: 2026-01-30 09:55:29.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:29 np0005601977 nova_compute[183130]: 2026-01-30 09:55:29.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:29.684 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:6c:5f 10.100.0.2 2001:db8::f816:3eff:fe0f:6c5f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe0f:6c5f/64', 'neutron:device_id': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aac1ed-7dad-4152-879a-9be32c3614e8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5eb0c483-295e-41b8-99f9-db990f69c678) old=Port_Binding(mac=['fa:16:3e:0f:6c:5f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:55:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:29.686 104706 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5eb0c483-295e-41b8-99f9-db990f69c678 in datapath 734c234c-1e07-4c56-b2d0-6f08a47eb16a updated#033[00m
Jan 30 04:55:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:29.687 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 734c234c-1e07-4c56-b2d0-6f08a47eb16a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:55:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:29.688 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea83d62-2e9c-4463-8930-cef458ba1ac8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:29.748 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:55:29 np0005601977 nova_compute[183130]: 2026-01-30 09:55:29.749 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:29 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:29.750 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:55:30 np0005601977 nova_compute[183130]: 2026-01-30 09:55:30.597 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:32 np0005601977 nova_compute[183130]: 2026-01-30 09:55:32.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:32 np0005601977 nova_compute[183130]: 2026-01-30 09:55:32.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:55:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:32.752 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:33 np0005601977 nova_compute[183130]: 2026-01-30 09:55:33.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:33 np0005601977 nova_compute[183130]: 2026-01-30 09:55:33.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:55:33 np0005601977 nova_compute[183130]: 2026-01-30 09:55:33.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:55:33 np0005601977 nova_compute[183130]: 2026-01-30 09:55:33.360 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:55:33 np0005601977 nova_compute[183130]: 2026-01-30 09:55:33.741 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:33 np0005601977 podman[230901]: 2026-01-30 09:55:33.846949642 +0000 UTC m=+0.062390144 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:55:33 np0005601977 podman[230902]: 2026-01-30 09:55:33.850568956 +0000 UTC m=+0.063060204 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.271 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.271 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.302 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.383 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.383 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.390 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.391 183134 INFO nova.compute.claims [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.495 183134 DEBUG nova.compute.provider_tree [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.509 183134 DEBUG nova.scheduler.client.report [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.541 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.542 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.586 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.587 183134 DEBUG nova.network.neutron [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.598 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.612 183134 INFO nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.648 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.749 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.751 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.752 183134 INFO nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Creating image(s)#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.753 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.753 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.755 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.779 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.818 183134 DEBUG nova.policy [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.827 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.828 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.828 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.852 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.934 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.935 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.962 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.963 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:35 np0005601977 nova_compute[183130]: 2026-01-30 09:55:35.964 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.017 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.019 183134 DEBUG nova.virt.disk.api [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.020 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.070 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.070 183134 DEBUG nova.virt.disk.api [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.071 183134 DEBUG nova.objects.instance [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 69cef024-06dd-442f-b13e-b1b446e6d2a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.084 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.084 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Ensure instance console log exists: /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.085 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.085 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.085 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.356 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:36 np0005601977 nova_compute[183130]: 2026-01-30 09:55:36.516 183134 DEBUG nova.network.neutron [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Successfully created port: 8158beb1-bb0c-4018-b87f-889a7f7bfc30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.372 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.823 183134 DEBUG nova.network.neutron [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Successfully updated port: 8158beb1-bb0c-4018-b87f-889a7f7bfc30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.841 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.842 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.842 183134 DEBUG nova.network.neutron [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.922 183134 DEBUG nova.compute.manager [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-changed-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.923 183134 DEBUG nova.compute.manager [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Refreshing instance network info cache due to event network-changed-8158beb1-bb0c-4018-b87f-889a7f7bfc30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:55:37 np0005601977 nova_compute[183130]: 2026-01-30 09:55:37.924 183134 DEBUG oslo_concurrency.lockutils [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:55:38 np0005601977 nova_compute[183130]: 2026-01-30 09:55:38.743 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:38 np0005601977 nova_compute[183130]: 2026-01-30 09:55:38.784 183134 DEBUG nova.network.neutron [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.600 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:40 np0005601977 podman[230959]: 2026-01-30 09:55:40.864884165 +0000 UTC m=+0.082287083 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.932 183134 DEBUG nova.network.neutron [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updating instance_info_cache with network_info: [{"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.952 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.952 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Instance network_info: |[{"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.953 183134 DEBUG oslo_concurrency.lockutils [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.953 183134 DEBUG nova.network.neutron [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Refreshing network info cache for port 8158beb1-bb0c-4018-b87f-889a7f7bfc30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.958 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Start _get_guest_xml network_info=[{"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.963 183134 WARNING nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.969 183134 DEBUG nova.virt.libvirt.host [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.970 183134 DEBUG nova.virt.libvirt.host [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.973 183134 DEBUG nova.virt.libvirt.host [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.974 183134 DEBUG nova.virt.libvirt.host [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.975 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.976 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.976 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.977 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.977 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.977 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.978 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.978 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.978 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.979 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.979 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.980 183134 DEBUG nova.virt.hardware [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.986 183134 DEBUG nova.virt.libvirt.vif [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:55:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-292157088',display_name='tempest-TestGettingAddress-server-292157088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-292157088',id=56,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBINmXbTK7RjlQWBIWVNlehNtZnYEj57SLVKa2MAZNGWc2mcP5LaL/F98DiZ9YsaDDMSAQJPNQdTMZxaIBbPw00fR7PHLDID9O61doK2J20TKkHdwCrWMRj/YDbFYIk8f6Q==',key_name='tempest-TestGettingAddress-1467555575',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-7adzda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:55:35Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=69cef024-06dd-442f-b13e-b1b446e6d2a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.986 183134 DEBUG nova.network.os_vif_util [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:55:40 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.990 183134 DEBUG nova.network.os_vif_util [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:55:40 np0005601977 nova_compute[183130]: 2026-01-30 09:55:40.992 183134 DEBUG nova.objects.instance [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 69cef024-06dd-442f-b13e-b1b446e6d2a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:55:40 np0005601977 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.009 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <uuid>69cef024-06dd-442f-b13e-b1b446e6d2a7</uuid>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <name>instance-00000038</name>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-292157088</nova:name>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:55:40</nova:creationTime>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        <nova:port uuid="8158beb1-bb0c-4018-b87f-889a7f7bfc30">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fee6:90c9" ipVersion="6"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <entry name="serial">69cef024-06dd-442f-b13e-b1b446e6d2a7</entry>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <entry name="uuid">69cef024-06dd-442f-b13e-b1b446e6d2a7</entry>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.config"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:e6:90:c9"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <target dev="tap8158beb1-bb"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/console.log" append="off"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:55:41 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:55:41 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:55:41 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:55:41 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.010 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Preparing to wait for external event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.010 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.011 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.011 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.012 183134 DEBUG nova.virt.libvirt.vif [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:55:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-292157088',display_name='tempest-TestGettingAddress-server-292157088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-292157088',id=56,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBINmXbTK7RjlQWBIWVNlehNtZnYEj57SLVKa2MAZNGWc2mcP5LaL/F98DiZ9YsaDDMSAQJPNQdTMZxaIBbPw00fR7PHLDID9O61doK2J20TKkHdwCrWMRj/YDbFYIk8f6Q==',key_name='tempest-TestGettingAddress-1467555575',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-7adzda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:55:35Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=69cef024-06dd-442f-b13e-b1b446e6d2a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.013 183134 DEBUG nova.network.os_vif_util [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.014 183134 DEBUG nova.network.os_vif_util [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.015 183134 DEBUG os_vif [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.015 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.016 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.016 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.020 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.021 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8158beb1-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.022 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8158beb1-bb, col_values=(('external_ids', {'iface-id': '8158beb1-bb0c-4018-b87f-889a7f7bfc30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:90:c9', 'vm-uuid': '69cef024-06dd-442f-b13e-b1b446e6d2a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:41 np0005601977 NetworkManager[55565]: <info>  [1769766941.0247] manager: (tap8158beb1-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.023 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.027 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.029 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.030 183134 INFO os_vif [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb')#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.109 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.110 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.110 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:e6:90:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.110 183134 INFO nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Using config drive#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.963 183134 INFO nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Creating config drive at /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.config#033[00m
Jan 30 04:55:41 np0005601977 nova_compute[183130]: 2026-01-30 09:55:41.966 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqr7kxvun execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.081 183134 DEBUG oslo_concurrency.processutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqr7kxvun" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:55:42 np0005601977 kernel: tap8158beb1-bb: entered promiscuous mode
Jan 30 04:55:42 np0005601977 NetworkManager[55565]: <info>  [1769766942.1267] manager: (tap8158beb1-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Jan 30 04:55:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:42Z|00551|binding|INFO|Claiming lport 8158beb1-bb0c-4018-b87f-889a7f7bfc30 for this chassis.
Jan 30 04:55:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:42Z|00552|binding|INFO|8158beb1-bb0c-4018-b87f-889a7f7bfc30: Claiming fa:16:3e:e6:90:c9 10.100.0.6 2001:db8::f816:3eff:fee6:90c9
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.127 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.130 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.133 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.143 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:90:c9 10.100.0.6 2001:db8::f816:3eff:fee6:90c9'], port_security=['fa:16:3e:e6:90:c9 10.100.0.6 2001:db8::f816:3eff:fee6:90c9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8::f816:3eff:fee6:90c9/64', 'neutron:device_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '87ba848e-f42a-46db-90b0-b00a185ea7f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aac1ed-7dad-4152-879a-9be32c3614e8, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=8158beb1-bb0c-4018-b87f-889a7f7bfc30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.144 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 8158beb1-bb0c-4018-b87f-889a7f7bfc30 in datapath 734c234c-1e07-4c56-b2d0-6f08a47eb16a bound to our chassis#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.145 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 734c234c-1e07-4c56-b2d0-6f08a47eb16a#033[00m
Jan 30 04:55:42 np0005601977 systemd-udevd[231005]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:55:42 np0005601977 systemd-machined[154431]: New machine qemu-45-instance-00000038.
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.154 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[0aef21fe-a5d6-47f4-8808-1ab1e56d9b27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.154 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap734c234c-11 in ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.154 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:42Z|00553|binding|INFO|Setting lport 8158beb1-bb0c-4018-b87f-889a7f7bfc30 ovn-installed in OVS
Jan 30 04:55:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:42Z|00554|binding|INFO|Setting lport 8158beb1-bb0c-4018-b87f-889a7f7bfc30 up in Southbound
Jan 30 04:55:42 np0005601977 NetworkManager[55565]: <info>  [1769766942.1566] device (tap8158beb1-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:55:42 np0005601977 NetworkManager[55565]: <info>  [1769766942.1577] device (tap8158beb1-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.159 211716 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap734c234c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.159 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a187e0d7-fde6-4b5e-bd6c-4c3efafa6a99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.160 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[760071a7-e333-4403-b977-a6fe9af22a0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 systemd[1]: Started Virtual Machine qemu-45-instance-00000038.
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.171 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[9c41c7ee-48af-4be1-bfac-a2884bdfb388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.180 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[fb35e936-37c3-427b-896c-63165acdb696]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.198 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d6e41c-7323-40f7-92f6-71eb6f186187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.202 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[70d4627a-f907-48fe-8925-b3461874ec92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 NetworkManager[55565]: <info>  [1769766942.2030] manager: (tap734c234c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Jan 30 04:55:42 np0005601977 systemd-udevd[231009]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.221 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd33792-3004-4ffc-8f8f-5a94bbb6c2d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.225 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[90d01b03-cea4-4a65-a827-6c4fdc8f6ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 NetworkManager[55565]: <info>  [1769766942.2366] device (tap734c234c-10): carrier: link connected
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.239 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[9073e9b3-080c-480e-82fc-9351b751a125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.249 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[40571c94-65ac-4d2b-9e56-73547c3cc8bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap734c234c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:6c:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547357, 'reachable_time': 44791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231039, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.259 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[336d379f-30ec-464b-940f-d106205822db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:6c5f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547357, 'tstamp': 547357}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231040, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.270 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[707d96e1-1ce7-4674-8d2b-93a55f207fca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap734c234c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:6c:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547357, 'reachable_time': 44791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231041, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.292 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[eab7499e-db87-4244-a1c9-afd8ae56a4e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.330 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[948eda2e-2d88-421c-9677-ee74cac48c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.331 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap734c234c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.331 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.332 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap734c234c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:42 np0005601977 NetworkManager[55565]: <info>  [1769766942.3350] manager: (tap734c234c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 30 04:55:42 np0005601977 kernel: tap734c234c-10: entered promiscuous mode
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.335 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.339 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap734c234c-10, col_values=(('external_ids', {'iface-id': '5eb0c483-295e-41b8-99f9-db990f69c678'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.341 104706 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/734c234c-1e07-4c56-b2d0-6f08a47eb16a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/734c234c-1e07-4c56-b2d0-6f08a47eb16a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 30 04:55:42 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:42Z|00555|binding|INFO|Releasing lport 5eb0c483-295e-41b8-99f9-db990f69c678 from this chassis (sb_readonly=0)
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.340 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.342 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[1029cc80-5885-4aad-89bc-6644b934509a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.343 104706 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: global
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    log         /dev/log local0 debug
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    log-tag     haproxy-metadata-proxy-734c234c-1e07-4c56-b2d0-6f08a47eb16a
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    user        root
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    group       root
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    maxconn     1024
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    pidfile     /var/lib/neutron/external/pids/734c234c-1e07-4c56-b2d0-6f08a47eb16a.pid.haproxy
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    daemon
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: defaults
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    log global
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    mode http
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    option httplog
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    option dontlognull
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    option http-server-close
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    option forwardfor
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    retries                 3
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    timeout http-request    30s
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    timeout connect         30s
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    timeout client          32s
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    timeout server          32s
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    timeout http-keep-alive 30s
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: listen listener
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    bind 169.254.169.254:80
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    server metadata /var/lib/neutron/metadata_proxy
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]:    http-request add-header X-OVN-Network-ID 734c234c-1e07-4c56-b2d0-6f08a47eb16a
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 30 04:55:42 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:42.344 104706 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'env', 'PROCESS_TAG=haproxy-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/734c234c-1e07-4c56-b2d0-6f08a47eb16a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.344 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.480 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766942.4800308, 69cef024-06dd-442f-b13e-b1b446e6d2a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.481 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] VM Started (Lifecycle Event)#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.498 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.504 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766942.4811952, 69cef024-06dd-442f-b13e-b1b446e6d2a7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.504 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.524 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.528 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.546 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:55:42 np0005601977 podman[231080]: 2026-01-30 09:55:42.672652409 +0000 UTC m=+0.047553850 container create 684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:55:42 np0005601977 systemd[1]: Started libpod-conmon-684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa.scope.
Jan 30 04:55:42 np0005601977 systemd[1]: Started libcrun container.
Jan 30 04:55:42 np0005601977 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b51a5db252030745d67f4cc7aa85671eec2009da545d6c059227ecdc98fb9c84/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:55:42 np0005601977 podman[231080]: 2026-01-30 09:55:42.648283493 +0000 UTC m=+0.023184924 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:55:42 np0005601977 podman[231080]: 2026-01-30 09:55:42.752505352 +0000 UTC m=+0.127406823 container init 684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:55:42 np0005601977 podman[231080]: 2026-01-30 09:55:42.760572493 +0000 UTC m=+0.135473914 container start 684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:55:42 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [NOTICE]   (231099) : New worker (231101) forked
Jan 30 04:55:42 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [NOTICE]   (231099) : Loading success.
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.995 183134 DEBUG nova.compute.manager [req-e64c1672-a338-4b2e-a7c0-4755c55e5ca5 req-bba40ccf-c936-4375-bfe4-223e63fe3364 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.996 183134 DEBUG oslo_concurrency.lockutils [req-e64c1672-a338-4b2e-a7c0-4755c55e5ca5 req-bba40ccf-c936-4375-bfe4-223e63fe3364 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.996 183134 DEBUG oslo_concurrency.lockutils [req-e64c1672-a338-4b2e-a7c0-4755c55e5ca5 req-bba40ccf-c936-4375-bfe4-223e63fe3364 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.996 183134 DEBUG oslo_concurrency.lockutils [req-e64c1672-a338-4b2e-a7c0-4755c55e5ca5 req-bba40ccf-c936-4375-bfe4-223e63fe3364 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.996 183134 DEBUG nova.compute.manager [req-e64c1672-a338-4b2e-a7c0-4755c55e5ca5 req-bba40ccf-c936-4375-bfe4-223e63fe3364 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Processing event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:55:42 np0005601977 nova_compute[183130]: 2026-01-30 09:55:42.997 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.000 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766943.0003083, 69cef024-06dd-442f-b13e-b1b446e6d2a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.000 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.002 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.005 183134 INFO nova.virt.libvirt.driver [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Instance spawned successfully.#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.006 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.024 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.027 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.038 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.039 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.040 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.041 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.041 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.042 183134 DEBUG nova.virt.libvirt.driver [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.052 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.109 183134 INFO nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Took 7.36 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.109 183134 DEBUG nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.184 183134 INFO nova.compute.manager [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Took 7.83 seconds to build instance.#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.225 183134 DEBUG oslo_concurrency.lockutils [None req-cf4257ce-2541-4c36-ae98-e8adeb1967cd 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.954s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:43 np0005601977 nova_compute[183130]: 2026-01-30 09:55:43.777 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:44 np0005601977 nova_compute[183130]: 2026-01-30 09:55:44.942 183134 DEBUG nova.network.neutron [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updated VIF entry in instance network info cache for port 8158beb1-bb0c-4018-b87f-889a7f7bfc30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:55:44 np0005601977 nova_compute[183130]: 2026-01-30 09:55:44.943 183134 DEBUG nova.network.neutron [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updating instance_info_cache with network_info: [{"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:55:44 np0005601977 nova_compute[183130]: 2026-01-30 09:55:44.959 183134 DEBUG oslo_concurrency.lockutils [req-dabe2340-83b9-4b76-a665-ac9b18067fa4 req-e579cf64-b41e-4be4-9466-cb2001bbf47a dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:55:45 np0005601977 nova_compute[183130]: 2026-01-30 09:55:45.072 183134 DEBUG nova.compute.manager [req-fb688a0f-6863-4e0b-a197-225a136f6607 req-279eede8-fc22-4ab4-a7bf-2c1b2e0b3a9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:45 np0005601977 nova_compute[183130]: 2026-01-30 09:55:45.072 183134 DEBUG oslo_concurrency.lockutils [req-fb688a0f-6863-4e0b-a197-225a136f6607 req-279eede8-fc22-4ab4-a7bf-2c1b2e0b3a9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:45 np0005601977 nova_compute[183130]: 2026-01-30 09:55:45.073 183134 DEBUG oslo_concurrency.lockutils [req-fb688a0f-6863-4e0b-a197-225a136f6607 req-279eede8-fc22-4ab4-a7bf-2c1b2e0b3a9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:45 np0005601977 nova_compute[183130]: 2026-01-30 09:55:45.073 183134 DEBUG oslo_concurrency.lockutils [req-fb688a0f-6863-4e0b-a197-225a136f6607 req-279eede8-fc22-4ab4-a7bf-2c1b2e0b3a9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:45 np0005601977 nova_compute[183130]: 2026-01-30 09:55:45.074 183134 DEBUG nova.compute.manager [req-fb688a0f-6863-4e0b-a197-225a136f6607 req-279eede8-fc22-4ab4-a7bf-2c1b2e0b3a9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] No waiting events found dispatching network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:55:45 np0005601977 nova_compute[183130]: 2026-01-30 09:55:45.074 183134 WARNING nova.compute.manager [req-fb688a0f-6863-4e0b-a197-225a136f6607 req-279eede8-fc22-4ab4-a7bf-2c1b2e0b3a9e dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received unexpected event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 for instance with vm_state active and task_state None.#033[00m
Jan 30 04:55:46 np0005601977 nova_compute[183130]: 2026-01-30 09:55:46.024 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:46Z|00556|binding|INFO|Releasing lport 5eb0c483-295e-41b8-99f9-db990f69c678 from this chassis (sb_readonly=0)
Jan 30 04:55:46 np0005601977 NetworkManager[55565]: <info>  [1769766946.8065] manager: (patch-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Jan 30 04:55:46 np0005601977 NetworkManager[55565]: <info>  [1769766946.8076] manager: (patch-br-int-to-provnet-8f19b27c-970a-4b38-8a34-7e9bf6e0439d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 30 04:55:46 np0005601977 nova_compute[183130]: 2026-01-30 09:55:46.808 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:46 np0005601977 nova_compute[183130]: 2026-01-30 09:55:46.828 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:46 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:46Z|00557|binding|INFO|Releasing lport 5eb0c483-295e-41b8-99f9-db990f69c678 from this chassis (sb_readonly=0)
Jan 30 04:55:46 np0005601977 nova_compute[183130]: 2026-01-30 09:55:46.839 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:47 np0005601977 nova_compute[183130]: 2026-01-30 09:55:47.220 183134 DEBUG nova.compute.manager [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-changed-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:55:47 np0005601977 nova_compute[183130]: 2026-01-30 09:55:47.221 183134 DEBUG nova.compute.manager [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Refreshing instance network info cache due to event network-changed-8158beb1-bb0c-4018-b87f-889a7f7bfc30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:55:47 np0005601977 nova_compute[183130]: 2026-01-30 09:55:47.221 183134 DEBUG oslo_concurrency.lockutils [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:55:47 np0005601977 nova_compute[183130]: 2026-01-30 09:55:47.221 183134 DEBUG oslo_concurrency.lockutils [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:55:47 np0005601977 nova_compute[183130]: 2026-01-30 09:55:47.221 183134 DEBUG nova.network.neutron [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Refreshing network info cache for port 8158beb1-bb0c-4018-b87f-889a7f7bfc30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:55:48 np0005601977 nova_compute[183130]: 2026-01-30 09:55:48.781 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:49 np0005601977 nova_compute[183130]: 2026-01-30 09:55:49.225 183134 DEBUG nova.network.neutron [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updated VIF entry in instance network info cache for port 8158beb1-bb0c-4018-b87f-889a7f7bfc30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:55:49 np0005601977 nova_compute[183130]: 2026-01-30 09:55:49.226 183134 DEBUG nova.network.neutron [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updating instance_info_cache with network_info: [{"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:55:49 np0005601977 nova_compute[183130]: 2026-01-30 09:55:49.242 183134 DEBUG oslo_concurrency.lockutils [req-9f6154a7-e91a-4eba-8d6f-d12cb858e8bb req-4571bddc-1ec3-4e8d-8b1f-0a724e994b8f dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:55:49 np0005601977 podman[231111]: 2026-01-30 09:55:49.843903664 +0000 UTC m=+0.060376007 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:55:51 np0005601977 nova_compute[183130]: 2026-01-30 09:55:51.058 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:53 np0005601977 nova_compute[183130]: 2026-01-30 09:55:53.783 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.456 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'name': 'tempest-TestGettingAddress-server-292157088', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'hostId': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.460 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 69cef024-06dd-442f-b13e-b1b446e6d2a7 / tap8158beb1-bb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.460 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '034910b1-bcd9-4f9d-8234-9195e9a1f774', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.458092', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'dea8ec20-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': '412ce0964f79eba3f5ff3a4da3c2c2282a08b3da37c061df0b683599c3111219'}]}, 'timestamp': '2026-01-30 09:55:55.461503', '_unique_id': '4eb6c333e9064947816c408179c5ca0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.462 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.463 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.473 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.474 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b605b9c-e47d-4945-ba0c-cd2dfc9255b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.463409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deaae3ea-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.858289065, 'message_signature': 'fa1c9d3193996c64096848934d12ea42703cebf27bdcfb9fed5c7cda9d8c8c5c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 
'69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.463409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deaaf060-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.858289065, 'message_signature': '8e177b7d05017bab1834a672f9b1b81951932a4b7781819dc492cfc57cf767dc'}]}, 'timestamp': '2026-01-30 09:55:55.474662', '_unique_id': '18c640d4640546359857ea1fd91c3bc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.475 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.476 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.499 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.499 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c73b12cc-e393-4687-912d-3061cb23e894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.476561', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deaec88e-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': '5ce754d72762534ff32fd2018fd3377ada05fe0049aa703faad958dc7552b264'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.476561', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deaed6bc-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': '817fed8fdf9573581088e49fc243257e10daa970914856ceca7f725240491baf'}]}, 'timestamp': '2026-01-30 09:55:55.500302', '_unique_id': '10a3c69ad60d4db8980b51b07f43f21c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.501 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>]
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>]
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '554ffeae-b404-41a1-adb3-62c819aacb78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.503039', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deaf511e-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': 'f36541826ab6c5a9d2079c4b3c65b64f1381ce07e447b372c798da11b4238eb4'}]}, 'timestamp': '2026-01-30 09:55:55.503375', '_unique_id': 'cc7b3880849e4d9d9b0cb41d3eb3f306'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.521 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f2a24e-6802-4762-8ca2-9b9975191ad5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'timestamp': '2026-01-30T09:55:55.504976', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'deb218fe-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.915986304, 'message_signature': 'b362456167bb372d0d12539aa59bb877a4acab5c486be033b71a4589d8da5c39'}]}, 'timestamp': '2026-01-30 09:55:55.521631', '_unique_id': '81e6fe99b5974ce0abb841f02679ecd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.523 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.523 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c81b109-5f0e-4779-801c-7a9b974b345f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.523473', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb26e6c-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.858289065, 'message_signature': '9a906162875bd94c28663e4a9a7e6432743ba1848b34c25ba5b14bd287272d07'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.523473', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb279c0-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.858289065, 'message_signature': '079cda98fe020a9444b44733cfb8330c2e1560bd2edaa820dd706cd4f930d5f1'}]}, 'timestamp': '2026-01-30 09:55:55.524069', '_unique_id': '71b9685e059a4b0ea66462c7ecf04f45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.525 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.write.latency volume: 1854697560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19b572b1-330e-4987-a269-7e3ce5ffc73a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1854697560, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.525697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb2c542-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': 'fb30c21523650beac1fd95afcd676cc00df279ca2a3e80dfa93175d865421432'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.525697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb2d06e-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': 'bbf32bfe5d607debce27e7dfadc3fb3446526fdb7871d56da88352f43d6c5d83'}]}, 'timestamp': '2026-01-30 09:55:55.526305', '_unique_id': '3b2b1b76b3aa4c4ca893d8e8b8507974'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.527 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.read.bytes volume: 31013376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.528 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f683b31b-b5f6-4769-bcab-4c9563becd41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31013376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.527864', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb31a24-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': '4619958e64c51579ec718d890be5d85e70d883b8c4490aaf704870d69130bf36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 
'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.527864', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb32780-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': 'd41b3a59ac7d5e43fc518cd589a0a75ebad9182c742af9bff97bdfc4f19b8643'}]}, 'timestamp': '2026-01-30 09:55:55.528520', '_unique_id': '1469b98b493740dab034d43f342b59c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.530 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdf903d4-6de1-4b85-9361-b2726e845ed4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.530156', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb37550-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': 'a4c9fe81024067d99200ed215f03b5559c7539b869adbb961539c7348c6c87b1'}]}, 'timestamp': '2026-01-30 09:55:55.530523', '_unique_id': '0db18f42ac4f451e8a6092a3760e0b92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.532 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fa3084f-b6d3-4bfd-9a95-52581a7982a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.532119', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb3c2f8-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': '9ae1da1951249e4dab053100589a60191cc7ca065b7f6c969a96257f2f45d5c0'}]}, 'timestamp': '2026-01-30 09:55:55.532515', '_unique_id': '148ac636088f4097bc54e7dd4e189b27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.534 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.534 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>]
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.534 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91690fe0-cae1-4d38-90f2-0a5ce6808564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.534619', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb42202-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': 'e9c6eaa4d092efe8805981e0b1135e478b1fed7d52dbdcdaf3a911333690c2f8'}]}, 'timestamp': '2026-01-30 09:55:55.534945', '_unique_id': 'dce499d3c1ee4f788915d3b1a24f5fc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.536 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.536 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd03e14a-29d6-4e5e-b4cb-5736f7116a01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.536609', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb46ef6-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.858289065, 'message_signature': 'a4e09bc6015a71c2e994b7856d4d9ff426533065260d2c04ed1cd884df1be53f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.536609', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb477fc-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.858289065, 'message_signature': '2a1de10dcfd8a09a8067363e0f7d7476261dadaa9f599fbd1e1854f7fc6a48eb'}]}, 'timestamp': '2026-01-30 09:55:55.537122', '_unique_id': 'a9ee0042e62c4121ad6cfe02dfe1c706'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.538 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5417fd55-cdad-42b8-afe2-0084ce3a842e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.538723', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb4c266-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': 'ec9e17d7d018fcb8fb7ad1bc9b8e2c602acebfe65339f901cc90d41dd6b8a6f6'}]}, 'timestamp': '2026-01-30 09:55:55.539052', '_unique_id': 'a2179673dadd4df290fdabda6e3dec93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.540 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33297f38-a663-4901-bc9d-ff58a771adc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.540838', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb51518-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': '4f837b4a96e5ad2903d9f47fa91e547cf5d2ece75e4ee2f63472c7cadd6e1fd3'}]}, 'timestamp': '2026-01-30 09:55:55.541185', '_unique_id': '533c4a7e7de34859a530c3e79e231aff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.542 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/cpu volume: 10790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ad62b9c-2a90-4494-b91d-1f41eb69e333', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10790000000, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'timestamp': '2026-01-30T09:55:55.542958', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'deb567ca-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.915986304, 'message_signature': 'b9011b6bde9ac80988c0f24f7e58b73840435f9cd87783f9b680f4ec9e5a550a'}]}, 'timestamp': '2026-01-30 09:55:55.543293', '_unique_id': '32d9ac51603b4c0e8c6edcb9a7987271'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.544 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.read.latency volume: 590881611 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.545 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.read.latency volume: 54715476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d20a779-59b1-48ff-9bd5-e2a25a53768a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 590881611, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.544873', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb5b2ac-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': 'bbafc2930ad90e0b47b6d5fd471190ed2a926a06b08a0b2b20fc6e4b531afacd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54715476, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.544873', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb5bf2c-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': 'fb737596f469a987e652e2d0477c2bff781445ae50d2965d342c7fbf1cc7d9db'}]}, 'timestamp': '2026-01-30 09:55:55.545508', '_unique_id': 'd503573f63524f2b8c7b60db73cbfc38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.546 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.incoming.bytes volume: 918 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41d25029-4a84-46df-afe6-b4c82b0b6f37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 918, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.547112', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb60ab8-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': 'bfe10e35086a2f840d04c8baf0acd2c1bb66db2e2aa30731e53ec88aa80925a4'}]}, 'timestamp': '2026-01-30 09:55:55.547459', '_unique_id': '2632ab86112c4b6ca1659229f4de5903'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.548 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45be68eb-0abb-447d-8e29-519a30b4dea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.549040', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb65644-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': '1ca282fabd20d2564c674d67a6d7e0ce5a0c58c0775c349097db9027a4ead5c4'}]}, 'timestamp': '2026-01-30 09:55:55.549387', '_unique_id': 'd0e97337e29d4192bb90441214062aca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.549 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.551 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.551 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.551 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-292157088>]
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.551 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.551 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9b7d863-ca0b-4408-a053-2e67a2f8717b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': 'instance-00000038-69cef024-06dd-442f-b13e-b1b446e6d2a7-tap8158beb1-bb', 'timestamp': '2026-01-30T09:55:55.551659', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'tap8158beb1-bb', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:90:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8158beb1-bb'}, 'message_id': 'deb6bbc0-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.852981413, 'message_signature': 'f750852a9216dbc5c1e5e313c17e602c1e003a5ca11530af0cf125edeacb0953'}]}, 'timestamp': '2026-01-30 09:55:55.551989', '_unique_id': '1c277bbfb96a45bdbac551e1019f9460'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.552 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.553 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.554 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.write.requests volume: 279 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.554 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c339647f-185f-453c-8848-2fe87c6df107', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 279, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.554027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb71c32-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': 'fd7d14ce9922c3580c4246ed82ee82832322d0af99ab6a5b2a733f6735fe2a3a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.554027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb72740-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': '205487f51bcb994570e3066f9540245683f8262ea54ad6d64fc5028608fb53b9'}]}, 'timestamp': '2026-01-30 09:55:55.554717', '_unique_id': '523c06e7d29f4e1aa78cadf890809118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.556 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.556 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.read.requests volume: 1134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.557 12 DEBUG ceilometer.compute.pollsters [-] 69cef024-06dd-442f-b13e-b1b446e6d2a7/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa0c1259-2782-40dc-9831-be65734dca6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1134, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-vda', 'timestamp': '2026-01-30T09:55:55.556724', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'deb7810e-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': '1dbdf4ee040c4961ca17f6158ddeb2cd73e91a944a5695b93d44d5a3844d67be'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_name': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_name': None, 'resource_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7-sda', 'timestamp': '2026-01-30T09:55:55.556724', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-292157088', 'name': 'instance-00000038', 'instance_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'instance_type': 'm1.nano', 'host': '8d00dad7134370770f049eab2d12bd1483de8692b7a86f804cd78989', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '43faf4bc-65eb-437f-b3dc-707ebe898840', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}, 'image_ref': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'deb78b7c-fdc1-11f0-a471-fa163eabe782', 'monotonic_time': 5486.871461281, 'message_signature': '427536a5454dbb7c15bbd1443bd6631de96be35d70f6691871934ee430f7f526'}]}, 'timestamp': '2026-01-30 09:55:55.557313', '_unique_id': '0308ca6348a34359825a3488f1c8fcc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     yield
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 30 04:55:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:55:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 30 04:55:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:55Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:90:c9 10.100.0.6
Jan 30 04:55:55 np0005601977 ovn_controller[95460]: 2026-01-30T09:55:55Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:90:c9 10.100.0.6
Jan 30 04:55:56 np0005601977 nova_compute[183130]: 2026-01-30 09:55:56.061 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:57.411 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:55:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:57.412 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:55:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:55:57.413 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:55:58 np0005601977 nova_compute[183130]: 2026-01-30 09:55:58.785 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:55:58 np0005601977 podman[231150]: 2026-01-30 09:55:58.849483331 +0000 UTC m=+0.067880841 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 30 04:55:58 np0005601977 podman[231149]: 2026-01-30 09:55:58.859349853 +0000 UTC m=+0.078474134 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc.)
Jan 30 04:56:01 np0005601977 nova_compute[183130]: 2026-01-30 09:56:01.098 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:03 np0005601977 nova_compute[183130]: 2026-01-30 09:56:03.789 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:04 np0005601977 podman[231187]: 2026-01-30 09:56:04.83905271 +0000 UTC m=+0.047856229 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:56:04 np0005601977 podman[231186]: 2026-01-30 09:56:04.839992196 +0000 UTC m=+0.053891571 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.098 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.099 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.116 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.199 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.200 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.214 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.214 183134 INFO nova.compute.claims [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Claim successful on node compute-0.ctlplane.example.com#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.350 183134 DEBUG nova.compute.provider_tree [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.368 183134 DEBUG nova.scheduler.client.report [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.391 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.392 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.436 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.437 183134 DEBUG nova.network.neutron [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.455 183134 INFO nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.478 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.577 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.579 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.579 183134 INFO nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Creating image(s)#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.580 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.581 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.582 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.610 183134 DEBUG nova.policy [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.616 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.689 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.690 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.692 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.714 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.777 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.779 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.807 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
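The `qemu-img create` command above builds the instance disk as a thin qcow2 copy-on-write overlay whose backing file is the cached base image under `_base/`. The sketch below reconstructs that command line from the logged arguments; `qcow2_overlay_cmd` is a hypothetical helper (nova builds the argv internally), and nothing here actually shells out to qemu-img.

```python
# Rebuild the qemu-img invocation seen in the log: a qcow2 overlay backed
# by the cached base image, so the instance disk starts as a CoW layer.
def qcow2_overlay_cmd(base: str, disk: str, size_bytes: int) -> list:
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        disk, str(size_bytes),
    ]

cmd = qcow2_overlay_cmd(
    "/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4",
    "/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk",
    1073741824,  # 1 GiB, matching the m1.nano flavor's root_gb=1
)
```

Joining `cmd` with spaces reproduces the CMD string logged at 09:56:05.807.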
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.808 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.808 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.881 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.882 183134 DEBUG nova.virt.disk.api [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.883 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.957 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.959 183134 DEBUG nova.virt.disk.api [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
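The "Cannot resize image ... to a smaller size" line is the benign outcome of `can_resize_image`: the flavor's 1 GiB root disk is not larger than the overlay's current virtual size, so the grow step is skipped. A simplified sketch of that comparison (not nova's exact implementation, which also inspects the image with qemu-img first):

```python
def can_resize_image(current_virtual_size: int, requested_size: int) -> bool:
    # Growing is allowed; shrinking a qcow2 in place is not supported, and
    # an equal size means there is nothing to do. This mirrors the check
    # behind the "Cannot resize image ... to a smaller size." debug line.
    return requested_size > current_virtual_size

GiB = 1024 ** 3
assert can_resize_image(1 * GiB, 2 * GiB)       # grow: resize proceeds
assert not can_resize_image(1 * GiB, 1 * GiB)   # same size: skipped, as logged
```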
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.960 183134 DEBUG nova.objects.instance [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 36e7106c-ee0f-41ee-a9b1-c2f98104603b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.977 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.978 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Ensure instance console log exists: /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.978 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.979 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:05 np0005601977 nova_compute[183130]: 2026-01-30 09:56:05.979 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:06 np0005601977 nova_compute[183130]: 2026-01-30 09:56:06.101 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:06 np0005601977 nova_compute[183130]: 2026-01-30 09:56:06.633 183134 DEBUG nova.network.neutron [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Successfully created port: b32935e8-f47f-4a5d-978a-edd74286bcab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.424 183134 DEBUG nova.network.neutron [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Successfully updated port: b32935e8-f47f-4a5d-978a-edd74286bcab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.441 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.442 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.442 183134 DEBUG nova.network.neutron [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.538 183134 DEBUG nova.compute.manager [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-changed-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.539 183134 DEBUG nova.compute.manager [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Refreshing instance network info cache due to event network-changed-b32935e8-f47f-4a5d-978a-edd74286bcab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.540 183134 DEBUG oslo_concurrency.lockutils [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:56:07 np0005601977 nova_compute[183130]: 2026-01-30 09:56:07.642 183134 DEBUG nova.network.neutron [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.573 183134 DEBUG nova.network.neutron [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updating instance_info_cache with network_info: [{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.595 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.596 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Instance network_info: |[{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
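The network_info blob repeated in the cache-update and manager entries above is a list of VIF dicts, each carrying its network's subnets and fixed IPs. A short sketch of walking that structure to collect the instance's fixed addresses; the sample below is a trimmed copy of the logged entry, keeping only the fields the helper reads, and `fixed_ips` is an illustrative helper, not a nova function.

```python
import json

# Trimmed copy of the network_info entry from the log (one OVS VIF with an
# IPv6 SLAAC subnet and an IPv4 subnet); only the fields read below remain.
network_info = json.loads("""
[{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab",
  "address": "fa:16:3e:41:53:dd",
  "network": {"subnets": [
    {"version": 6, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed"}]},
    {"version": 4, "ips": [{"address": "10.100.0.13", "type": "fixed"}]}]}}]
""")

def fixed_ips(vifs):
    """Collect every fixed IP across all VIFs and their subnets."""
    return [
        ip["address"]
        for vif in vifs
        for subnet in vif["network"]["subnets"]
        for ip in subnet["ips"]
        if ip["type"] == "fixed"
    ]
```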
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.597 183134 DEBUG oslo_concurrency.lockutils [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.598 183134 DEBUG nova.network.neutron [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Refreshing network info cache for port b32935e8-f47f-4a5d-978a-edd74286bcab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.603 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Start _get_guest_xml network_info=[{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'encryption_format': None, 'image_id': 'ab7cf61b-98df-4a10-83fd-7d23191f2bba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.609 183134 WARNING nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.614 183134 DEBUG nova.virt.libvirt.host [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.614 183134 DEBUG nova.virt.libvirt.host [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.620 183134 DEBUG nova.virt.libvirt.host [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.621 183134 DEBUG nova.virt.libvirt.host [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.622 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.622 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-30T09:21:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43faf4bc-65eb-437f-b3dc-707ebe898840',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-30T09:21:58Z,direct_url=<?>,disk_format='qcow2',id=ab7cf61b-98df-4a10-83fd-7d23191f2bba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='27007ec4b64c41aeab1605ec0d373102',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-30T09:21:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.622 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.622 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.623 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.623 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.623 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.623 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.624 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.624 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.624 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.624 183134 DEBUG nova.virt.hardware [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
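The nova.virt.hardware lines above enumerate guest CPU topologies: with no flavor or image constraints (limits 0:0:0, maxima 65536 each), the search over (sockets, cores, threads) factorizations of 1 vCPU yields the single topology 1:1:1. A simplified sketch of that enumeration; nova's actual `_get_possible_cpu_topologies` additionally orders results by flavor/image preferences, which this version omits.

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Enumerate (sockets, cores, threads) triples whose product equals the
    # vCPU count, within the per-dimension maxima -- the search behind the
    # "Build topologies ..." / "Got N possible topologies" debug lines.
    out = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // s // c
            if t <= max_threads:
                out.append((s, c, t))
    return out

# For the 1-vCPU m1.nano flavor in the log, one topology survives: 1:1:1.
assert possible_topologies(1) == [(1, 1, 1)]
```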
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.628 183134 DEBUG nova.virt.libvirt.vif [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:56:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1712168096',display_name='tempest-TestGettingAddress-server-1712168096',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1712168096',id=57,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBINmXbTK7RjlQWBIWVNlehNtZnYEj57SLVKa2MAZNGWc2mcP5LaL/F98DiZ9YsaDDMSAQJPNQdTMZxaIBbPw00fR7PHLDID9O61doK2J20TKkHdwCrWMRj/YDbFYIk8f6Q==',key_name='tempest-TestGettingAddress-1467555575',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-nx64d9kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:56:05Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=36e7106c-ee0f-41ee-a9b1-c2f98104603b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.629 183134 DEBUG nova.network.os_vif_util [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.629 183134 DEBUG nova.network.os_vif_util [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.630 183134 DEBUG nova.objects.instance [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'pci_devices' on Instance uuid 36e7106c-ee0f-41ee-a9b1-c2f98104603b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.645 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] End _get_guest_xml xml=<domain type="kvm">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <uuid>36e7106c-ee0f-41ee-a9b1-c2f98104603b</uuid>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <name>instance-00000039</name>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <memory>131072</memory>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <vcpu>1</vcpu>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <metadata>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:name>tempest-TestGettingAddress-server-1712168096</nova:name>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:creationTime>2026-01-30 09:56:08</nova:creationTime>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:flavor name="m1.nano">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:memory>128</nova:memory>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:disk>1</nova:disk>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:swap>0</nova:swap>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:ephemeral>0</nova:ephemeral>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:vcpus>1</nova:vcpus>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      </nova:flavor>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:owner>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:user uuid="4f469d29ddd6455299c7fb0220c1ffcc">tempest-TestGettingAddress-1926219776-project-member</nova:user>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:project uuid="69532d75aefe4fa6ada76bf1c1d1da9b">tempest-TestGettingAddress-1926219776</nova:project>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      </nova:owner>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:root type="image" uuid="ab7cf61b-98df-4a10-83fd-7d23191f2bba"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <nova:ports>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        <nova:port uuid="b32935e8-f47f-4a5d-978a-edd74286bcab">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe41:53dd" ipVersion="6"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:        </nova:port>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      </nova:ports>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </nova:instance>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </metadata>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <sysinfo type="smbios">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <system>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <entry name="manufacturer">RDO</entry>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <entry name="product">OpenStack Compute</entry>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <entry name="serial">36e7106c-ee0f-41ee-a9b1-c2f98104603b</entry>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <entry name="uuid">36e7106c-ee0f-41ee-a9b1-c2f98104603b</entry>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <entry name="family">Virtual Machine</entry>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </system>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </sysinfo>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <os>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <boot dev="hd"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <smbios mode="sysinfo"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </os>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <features>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <acpi/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <apic/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <vmcoreinfo/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </features>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <clock offset="utc">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <timer name="pit" tickpolicy="delay"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <timer name="hpet" present="no"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </clock>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <cpu mode="custom" match="exact">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <model>Nehalem</model>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <topology sockets="1" cores="1" threads="1"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </cpu>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  <devices>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <disk type="file" device="disk">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <target dev="vda" bus="virtio"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <disk type="file" device="cdrom">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <driver name="qemu" type="raw" cache="none"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <source file="/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.config"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <target dev="sda" bus="sata"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </disk>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <interface type="ethernet">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <mac address="fa:16:3e:41:53:dd"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <driver name="vhost" rx_queue_size="512"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <mtu size="1442"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <target dev="tapb32935e8-f4"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </interface>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <serial type="pty">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <log file="/var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/console.log" append="off"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </serial>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <video>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <model type="virtio"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </video>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <input type="tablet" bus="usb"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <rng model="virtio">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <backend model="random">/dev/urandom</backend>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </rng>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="pci" model="pcie-root-port"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <controller type="usb" index="0"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    <memballoon model="virtio">
Jan 30 04:56:08 np0005601977 nova_compute[183130]:      <stats period="10"/>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:    </memballoon>
Jan 30 04:56:08 np0005601977 nova_compute[183130]:  </devices>
Jan 30 04:56:08 np0005601977 nova_compute[183130]: </domain>
Jan 30 04:56:08 np0005601977 nova_compute[183130]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.646 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Preparing to wait for external event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.646 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.646 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.647 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.647 183134 DEBUG nova.virt.libvirt.vif [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-30T09:56:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1712168096',display_name='tempest-TestGettingAddress-server-1712168096',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1712168096',id=57,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBINmXbTK7RjlQWBIWVNlehNtZnYEj57SLVKa2MAZNGWc2mcP5LaL/F98DiZ9YsaDDMSAQJPNQdTMZxaIBbPw00fR7PHLDID9O61doK2J20TKkHdwCrWMRj/YDbFYIk8f6Q==',key_name='tempest-TestGettingAddress-1467555575',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-nx64d9kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-30T09:56:05Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=36e7106c-ee0f-41ee-a9b1-c2f98104603b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.648 183134 DEBUG nova.network.os_vif_util [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.648 183134 DEBUG nova.network.os_vif_util [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.649 183134 DEBUG os_vif [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.649 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.649 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.650 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.653 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.653 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb32935e8-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.653 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb32935e8-f4, col_values=(('external_ids', {'iface-id': 'b32935e8-f47f-4a5d-978a-edd74286bcab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:53:dd', 'vm-uuid': '36e7106c-ee0f-41ee-a9b1-c2f98104603b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.654 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:08 np0005601977 NetworkManager[55565]: <info>  [1769766968.6559] manager: (tapb32935e8-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.658 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.663 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.664 183134 INFO os_vif [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4')#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.723 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.723 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.724 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] No VIF found with MAC fa:16:3e:41:53:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.724 183134 INFO nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Using config drive#033[00m
Jan 30 04:56:08 np0005601977 nova_compute[183130]: 2026-01-30 09:56:08.790 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.085 183134 INFO nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Creating config drive at /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.config#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.091 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0lc6mj0j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.211 183134 DEBUG oslo_concurrency.processutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0lc6mj0j" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:09 np0005601977 kernel: tapb32935e8-f4: entered promiscuous mode
Jan 30 04:56:09 np0005601977 NetworkManager[55565]: <info>  [1769766969.2609] manager: (tapb32935e8-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.261 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:09Z|00558|binding|INFO|Claiming lport b32935e8-f47f-4a5d-978a-edd74286bcab for this chassis.
Jan 30 04:56:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:09Z|00559|binding|INFO|b32935e8-f47f-4a5d-978a-edd74286bcab: Claiming fa:16:3e:41:53:dd 10.100.0.13 2001:db8::f816:3eff:fe41:53dd
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.268 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:53:dd 10.100.0.13 2001:db8::f816:3eff:fe41:53dd'], port_security=['fa:16:3e:41:53:dd 10.100.0.13 2001:db8::f816:3eff:fe41:53dd'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe41:53dd/64', 'neutron:device_id': '36e7106c-ee0f-41ee-a9b1-c2f98104603b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '87ba848e-f42a-46db-90b0-b00a185ea7f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aac1ed-7dad-4152-879a-9be32c3614e8, chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=b32935e8-f47f-4a5d-978a-edd74286bcab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.270 104706 INFO neutron.agent.ovn.metadata.agent [-] Port b32935e8-f47f-4a5d-978a-edd74286bcab in datapath 734c234c-1e07-4c56-b2d0-6f08a47eb16a bound to our chassis#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.271 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 734c234c-1e07-4c56-b2d0-6f08a47eb16a#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.271 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:09Z|00560|binding|INFO|Setting lport b32935e8-f47f-4a5d-978a-edd74286bcab ovn-installed in OVS
Jan 30 04:56:09 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:09Z|00561|binding|INFO|Setting lport b32935e8-f47f-4a5d-978a-edd74286bcab up in Southbound
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.274 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.279 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.284 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[cab93aa1-52b6-49ea-b0ca-49e10e4780d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:09 np0005601977 systemd-udevd[231265]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:56:09 np0005601977 systemd-machined[154431]: New machine qemu-46-instance-00000039.
Jan 30 04:56:09 np0005601977 NetworkManager[55565]: <info>  [1769766969.3042] device (tapb32935e8-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:56:09 np0005601977 systemd[1]: Started Virtual Machine qemu-46-instance-00000039.
Jan 30 04:56:09 np0005601977 NetworkManager[55565]: <info>  [1769766969.3057] device (tapb32935e8-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.311 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a732c819-2e3a-4fd4-9f78-bb0ba858525b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.317 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[8d558d94-7180-4fff-a64f-8fb79631dd0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.337 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[57706e6b-4e68-485f-8bf6-11fdde589a58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.351 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[4fda86f0-6663-4b1c-abeb-be8c11a80c04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap734c234c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:6c:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 20, 'tx_packets': 5, 'rx_bytes': 1656, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 20, 'tx_packets': 5, 'rx_bytes': 1656, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547357, 'reachable_time': 44791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 18, 'inoctets': 1320, 'indelivers': 4, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 18, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1320, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 18, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 4, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231273, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.363 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[652799ff-f697-4423-b890-b5cd93983f7c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap734c234c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547364, 'tstamp': 547364}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231278, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap734c234c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547366, 'tstamp': 547366}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231278, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.365 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap734c234c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.366 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.367 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.368 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap734c234c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.368 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.368 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap734c234c-10, col_values=(('external_ids', {'iface-id': '5eb0c483-295e-41b8-99f9-db990f69c678'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:09 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:09.369 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.541 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766969.5407994, 36e7106c-ee0f-41ee-a9b1-c2f98104603b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.542 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] VM Started (Lifecycle Event)#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.574 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.580 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766969.541236, 36e7106c-ee0f-41ee-a9b1-c2f98104603b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.580 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] VM Paused (Lifecycle Event)#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.608 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.613 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.636 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.666 183134 DEBUG nova.compute.manager [req-f2b38ab2-ad9a-4081-9688-08c62a78e9f2 req-9556d772-048e-46d0-a66a-66b2e31cc016 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.667 183134 DEBUG oslo_concurrency.lockutils [req-f2b38ab2-ad9a-4081-9688-08c62a78e9f2 req-9556d772-048e-46d0-a66a-66b2e31cc016 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.667 183134 DEBUG oslo_concurrency.lockutils [req-f2b38ab2-ad9a-4081-9688-08c62a78e9f2 req-9556d772-048e-46d0-a66a-66b2e31cc016 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.668 183134 DEBUG oslo_concurrency.lockutils [req-f2b38ab2-ad9a-4081-9688-08c62a78e9f2 req-9556d772-048e-46d0-a66a-66b2e31cc016 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.669 183134 DEBUG nova.compute.manager [req-f2b38ab2-ad9a-4081-9688-08c62a78e9f2 req-9556d772-048e-46d0-a66a-66b2e31cc016 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Processing event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.670 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.673 183134 DEBUG nova.virt.driver [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] Emitting event <LifecycleEvent: 1769766969.6736755, 36e7106c-ee0f-41ee-a9b1-c2f98104603b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.674 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] VM Resumed (Lifecycle Event)#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.676 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.679 183134 INFO nova.virt.libvirt.driver [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Instance spawned successfully.#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.679 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.695 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.702 183134 DEBUG nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.705 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.706 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.706 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.706 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.707 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.707 183134 DEBUG nova.virt.libvirt.driver [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.731 183134 INFO nova.compute.manager [None req-f7d06989-4cef-4180-9318-046a90826455 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.773 183134 INFO nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Took 4.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.773 183134 DEBUG nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.834 183134 INFO nova.compute.manager [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Took 4.68 seconds to build instance.#033[00m
Jan 30 04:56:09 np0005601977 nova_compute[183130]: 2026-01-30 09:56:09.848 183134 DEBUG oslo_concurrency.lockutils [None req-4bd54312-14df-4b8a-acff-ce55685ad1db 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:11 np0005601977 podman[231287]: 2026-01-30 09:56:11.902961865 +0000 UTC m=+0.109039808 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.007 183134 DEBUG nova.compute.manager [req-9459bd62-c87d-44c0-8480-1225ff5d6880 req-f5c40dde-88d4-4571-b96f-2b8f18c847d8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.007 183134 DEBUG oslo_concurrency.lockutils [req-9459bd62-c87d-44c0-8480-1225ff5d6880 req-f5c40dde-88d4-4571-b96f-2b8f18c847d8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.008 183134 DEBUG oslo_concurrency.lockutils [req-9459bd62-c87d-44c0-8480-1225ff5d6880 req-f5c40dde-88d4-4571-b96f-2b8f18c847d8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.008 183134 DEBUG oslo_concurrency.lockutils [req-9459bd62-c87d-44c0-8480-1225ff5d6880 req-f5c40dde-88d4-4571-b96f-2b8f18c847d8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.009 183134 DEBUG nova.compute.manager [req-9459bd62-c87d-44c0-8480-1225ff5d6880 req-f5c40dde-88d4-4571-b96f-2b8f18c847d8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] No waiting events found dispatching network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.009 183134 WARNING nova.compute.manager [req-9459bd62-c87d-44c0-8480-1225ff5d6880 req-f5c40dde-88d4-4571-b96f-2b8f18c847d8 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received unexpected event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab for instance with vm_state active and task_state None.#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.846 183134 DEBUG nova.network.neutron [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updated VIF entry in instance network info cache for port b32935e8-f47f-4a5d-978a-edd74286bcab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.846 183134 DEBUG nova.network.neutron [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updating instance_info_cache with network_info: [{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:12 np0005601977 nova_compute[183130]: 2026-01-30 09:56:12.862 183134 DEBUG oslo_concurrency.lockutils [req-9ee3b042-afaf-4507-95f5-b7d1e181f772 req-ef46f461-7764-4173-a1f7-6ca5ff3e9f16 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:56:13 np0005601977 nova_compute[183130]: 2026-01-30 09:56:13.655 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:13 np0005601977 nova_compute[183130]: 2026-01-30 09:56:13.792 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:14 np0005601977 nova_compute[183130]: 2026-01-30 09:56:14.549 183134 DEBUG nova.compute.manager [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-changed-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:14 np0005601977 nova_compute[183130]: 2026-01-30 09:56:14.550 183134 DEBUG nova.compute.manager [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Refreshing instance network info cache due to event network-changed-b32935e8-f47f-4a5d-978a-edd74286bcab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:56:14 np0005601977 nova_compute[183130]: 2026-01-30 09:56:14.550 183134 DEBUG oslo_concurrency.lockutils [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:56:14 np0005601977 nova_compute[183130]: 2026-01-30 09:56:14.551 183134 DEBUG oslo_concurrency.lockutils [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:56:14 np0005601977 nova_compute[183130]: 2026-01-30 09:56:14.551 183134 DEBUG nova.network.neutron [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Refreshing network info cache for port b32935e8-f47f-4a5d-978a-edd74286bcab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:56:16 np0005601977 nova_compute[183130]: 2026-01-30 09:56:16.094 183134 DEBUG nova.network.neutron [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updated VIF entry in instance network info cache for port b32935e8-f47f-4a5d-978a-edd74286bcab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:56:16 np0005601977 nova_compute[183130]: 2026-01-30 09:56:16.095 183134 DEBUG nova.network.neutron [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updating instance_info_cache with network_info: [{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:16 np0005601977 nova_compute[183130]: 2026-01-30 09:56:16.154 183134 DEBUG oslo_concurrency.lockutils [req-a7f13713-6b0f-4036-abe4-ec7484e9394f req-c67079a7-35c7-494d-aec0-1cab2e6f582b dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:56:18 np0005601977 nova_compute[183130]: 2026-01-30 09:56:18.659 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:18 np0005601977 nova_compute[183130]: 2026-01-30 09:56:18.794 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:20 np0005601977 podman[231335]: 2026-01-30 09:56:20.837939865 +0000 UTC m=+0.051142833 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:56:21 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:21Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:53:dd 10.100.0.13
Jan 30 04:56:21 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:21Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:53:dd 10.100.0.13
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.373 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.374 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.374 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.374 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.439 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.484 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.486 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.563 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.572 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.632 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.633 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.661 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.695 183134 DEBUG oslo_concurrency.processutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.796 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.876 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.879 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5388MB free_disk=73.18392181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.879 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.880 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.954 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 69cef024-06dd-442f-b13e-b1b446e6d2a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.955 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Instance 36e7106c-ee0f-41ee-a9b1-c2f98104603b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.956 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:56:23 np0005601977 nova_compute[183130]: 2026-01-30 09:56:23.956 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:56:24 np0005601977 nova_compute[183130]: 2026-01-30 09:56:24.010 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:56:24 np0005601977 nova_compute[183130]: 2026-01-30 09:56:24.025 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:56:24 np0005601977 nova_compute[183130]: 2026-01-30 09:56:24.051 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:56:24 np0005601977 nova_compute[183130]: 2026-01-30 09:56:24.052 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:28 np0005601977 nova_compute[183130]: 2026-01-30 09:56:28.052 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:28 np0005601977 nova_compute[183130]: 2026-01-30 09:56:28.052 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:28 np0005601977 nova_compute[183130]: 2026-01-30 09:56:28.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:28 np0005601977 nova_compute[183130]: 2026-01-30 09:56:28.664 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:28 np0005601977 nova_compute[183130]: 2026-01-30 09:56:28.799 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:29 np0005601977 nova_compute[183130]: 2026-01-30 09:56:29.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:29 np0005601977 nova_compute[183130]: 2026-01-30 09:56:29.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:29 np0005601977 podman[231373]: 2026-01-30 09:56:29.867023314 +0000 UTC m=+0.073811151 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, config_id=openstack_network_exporter)
Jan 30 04:56:29 np0005601977 podman[231374]: 2026-01-30 09:56:29.868116655 +0000 UTC m=+0.070184287 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.068 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.069 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.070 104706 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.268 183134 DEBUG nova.compute.manager [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-changed-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.269 183134 DEBUG nova.compute.manager [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Refreshing instance network info cache due to event network-changed-b32935e8-f47f-4a5d-978a-edd74286bcab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.270 183134 DEBUG oslo_concurrency.lockutils [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.270 183134 DEBUG oslo_concurrency.lockutils [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.270 183134 DEBUG nova.network.neutron [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Refreshing network info cache for port b32935e8-f47f-4a5d-978a-edd74286bcab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.334 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.334 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.335 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.336 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.336 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.338 183134 INFO nova.compute.manager [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Terminating instance#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.340 183134 DEBUG nova.compute.manager [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:56:32 np0005601977 kernel: tapb32935e8-f4 (unregistering): left promiscuous mode
Jan 30 04:56:32 np0005601977 NetworkManager[55565]: <info>  [1769766992.3646] device (tapb32935e8-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:56:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:32Z|00562|binding|INFO|Releasing lport b32935e8-f47f-4a5d-978a-edd74286bcab from this chassis (sb_readonly=0)
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.375 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:32Z|00563|binding|INFO|Setting lport b32935e8-f47f-4a5d-978a-edd74286bcab down in Southbound
Jan 30 04:56:32 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:32Z|00564|binding|INFO|Removing iface tapb32935e8-f4 ovn-installed in OVS
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.378 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.384 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:53:dd 10.100.0.13 2001:db8::f816:3eff:fe41:53dd'], port_security=['fa:16:3e:41:53:dd 10.100.0.13 2001:db8::f816:3eff:fe41:53dd'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe41:53dd/64', 'neutron:device_id': '36e7106c-ee0f-41ee-a9b1-c2f98104603b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '87ba848e-f42a-46db-90b0-b00a185ea7f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aac1ed-7dad-4152-879a-9be32c3614e8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=b32935e8-f47f-4a5d-978a-edd74286bcab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.386 104706 INFO neutron.agent.ovn.metadata.agent [-] Port b32935e8-f47f-4a5d-978a-edd74286bcab in datapath 734c234c-1e07-4c56-b2d0-6f08a47eb16a unbound from our chassis#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.387 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.389 104706 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 734c234c-1e07-4c56-b2d0-6f08a47eb16a#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.405 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb84c17-6ec4-414e-8e11-548abb80346f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:32 np0005601977 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 30 04:56:32 np0005601977 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000039.scope: Consumed 12.358s CPU time.
Jan 30 04:56:32 np0005601977 systemd-machined[154431]: Machine qemu-46-instance-00000039 terminated.
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.437 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[a5017f2a-b711-4e8f-8c9e-238d73a54cd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.442 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[c6dd4435-75d6-4844-a7d5-e1945b1f93d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.470 211854 DEBUG oslo.privsep.daemon [-] privsep: reply[204e7342-0e13-45c2-b8f5-014e69d33088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.486 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a79c1d07-a62e-4894-a78e-d0c949790361]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap734c234c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:6c:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 7, 'rx_bytes': 2780, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 7, 'rx_bytes': 2780, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547357, 'reachable_time': 44791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 30, 'inoctets': 2192, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 30, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2192, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 30, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231425, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.503 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[89adf486-6013-44d3-8812-311ffb280426]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap734c234c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547364, 'tstamp': 547364}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231426, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap734c234c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547366, 'tstamp': 547366}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231426, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.505 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap734c234c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.507 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.511 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.512 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap734c234c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.512 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.512 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap734c234c-10, col_values=(('external_ids', {'iface-id': '5eb0c483-295e-41b8-99f9-db990f69c678'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:32 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:32.513 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.561 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.566 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.602 183134 INFO nova.virt.libvirt.driver [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Instance destroyed successfully.#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.603 183134 DEBUG nova.objects.instance [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid 36e7106c-ee0f-41ee-a9b1-c2f98104603b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.621 183134 DEBUG nova.virt.libvirt.vif [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:56:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1712168096',display_name='tempest-TestGettingAddress-server-1712168096',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1712168096',id=57,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBINmXbTK7RjlQWBIWVNlehNtZnYEj57SLVKa2MAZNGWc2mcP5LaL/F98DiZ9YsaDDMSAQJPNQdTMZxaIBbPw00fR7PHLDID9O61doK2J20TKkHdwCrWMRj/YDbFYIk8f6Q==',key_name='tempest-TestGettingAddress-1467555575',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:56:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-nx64d9kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:56:09Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=36e7106c-ee0f-41ee-a9b1-c2f98104603b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.622 183134 DEBUG nova.network.os_vif_util [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.623 183134 DEBUG nova.network.os_vif_util [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.624 183134 DEBUG os_vif [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.626 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.627 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb32935e8-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.628 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.630 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.634 183134 INFO os_vif [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:53:dd,bridge_name='br-int',has_traffic_filtering=True,id=b32935e8-f47f-4a5d-978a-edd74286bcab,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb32935e8-f4')#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.635 183134 INFO nova.virt.libvirt.driver [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Deleting instance files /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b_del#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.636 183134 INFO nova.virt.libvirt.driver [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Deletion of /var/lib/nova/instances/36e7106c-ee0f-41ee-a9b1-c2f98104603b_del complete#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.703 183134 INFO nova.compute.manager [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.705 183134 DEBUG oslo.service.loopingcall [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.705 183134 DEBUG nova.compute.manager [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:56:32 np0005601977 nova_compute[183130]: 2026-01-30 09:56:32.705 183134 DEBUG nova.network.neutron [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.367 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.610 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.611 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquired lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.611 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.612 183134 DEBUG nova.objects.instance [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 69cef024-06dd-442f-b13e-b1b446e6d2a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:56:33 np0005601977 nova_compute[183130]: 2026-01-30 09:56:33.802 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:34 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:34.071 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9be64184-856f-4986-a80e-9403fa35a6a5, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.138 183134 DEBUG nova.network.neutron [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.162 183134 INFO nova.compute.manager [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Took 1.46 seconds to deallocate network for instance.#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.235 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.236 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.322 183134 DEBUG nova.compute.provider_tree [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.344 183134 DEBUG nova.scheduler.client.report [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.364 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.371 183134 DEBUG nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-vif-unplugged-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.372 183134 DEBUG oslo_concurrency.lockutils [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.372 183134 DEBUG oslo_concurrency.lockutils [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.373 183134 DEBUG oslo_concurrency.lockutils [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.373 183134 DEBUG nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] No waiting events found dispatching network-vif-unplugged-b32935e8-f47f-4a5d-978a-edd74286bcab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.374 183134 WARNING nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received unexpected event network-vif-unplugged-b32935e8-f47f-4a5d-978a-edd74286bcab for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.374 183134 DEBUG nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.374 183134 DEBUG oslo_concurrency.lockutils [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.375 183134 DEBUG oslo_concurrency.lockutils [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.375 183134 DEBUG oslo_concurrency.lockutils [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.376 183134 DEBUG nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] No waiting events found dispatching network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.376 183134 WARNING nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received unexpected event network-vif-plugged-b32935e8-f47f-4a5d-978a-edd74286bcab for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.376 183134 DEBUG nova.compute.manager [req-ae207578-a947-48de-8a9f-23b2151ff730 req-1eb8ffa9-815e-4602-9f62-6af8ed6d2222 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Received event network-vif-deleted-b32935e8-f47f-4a5d-978a-edd74286bcab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.405 183134 INFO nova.scheduler.client.report [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 36e7106c-ee0f-41ee-a9b1-c2f98104603b#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.465 183134 DEBUG oslo_concurrency.lockutils [None req-b96ac42f-1725-4a5b-8e2d-b0f8f65467dc 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "36e7106c-ee0f-41ee-a9b1-c2f98104603b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.900 183134 DEBUG nova.network.neutron [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updated VIF entry in instance network info cache for port b32935e8-f47f-4a5d-978a-edd74286bcab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.901 183134 DEBUG nova.network.neutron [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Updating instance_info_cache with network_info: [{"id": "b32935e8-f47f-4a5d-978a-edd74286bcab", "address": "fa:16:3e:41:53:dd", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe41:53dd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb32935e8-f4", "ovs_interfaceid": "b32935e8-f47f-4a5d-978a-edd74286bcab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:34 np0005601977 nova_compute[183130]: 2026-01-30 09:56:34.923 183134 DEBUG oslo_concurrency.lockutils [req-2a33a948-a5b7-4569-bb22-cf6a19d63c73 req-f2731d35-b6a4-4c47-91a7-7176f79bc681 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-36e7106c-ee0f-41ee-a9b1-c2f98104603b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.550 183134 DEBUG nova.network.neutron [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updating instance_info_cache with network_info: [{"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.586 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Releasing lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.587 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.588 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.589 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.607 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.608 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.608 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.609 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.609 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.612 183134 INFO nova.compute.manager [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Terminating instance#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.614 183134 DEBUG nova.compute.manager [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:56:35 np0005601977 kernel: tap8158beb1-bb (unregistering): left promiscuous mode
Jan 30 04:56:35 np0005601977 NetworkManager[55565]: <info>  [1769766995.6523] device (tap8158beb1-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.653 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:35Z|00565|binding|INFO|Releasing lport 8158beb1-bb0c-4018-b87f-889a7f7bfc30 from this chassis (sb_readonly=0)
Jan 30 04:56:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:35Z|00566|binding|INFO|Setting lport 8158beb1-bb0c-4018-b87f-889a7f7bfc30 down in Southbound
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.661 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 ovn_controller[95460]: 2026-01-30T09:56:35Z|00567|binding|INFO|Removing iface tap8158beb1-bb ovn-installed in OVS
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.663 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.670 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.671 104706 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:90:c9 10.100.0.6 2001:db8::f816:3eff:fee6:90c9'], port_security=['fa:16:3e:e6:90:c9 10.100.0.6 2001:db8::f816:3eff:fee6:90c9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8::f816:3eff:fee6:90c9/64', 'neutron:device_id': '69cef024-06dd-442f-b13e-b1b446e6d2a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '87ba848e-f42a-46db-90b0-b00a185ea7f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aac1ed-7dad-4152-879a-9be32c3614e8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>], logical_port=8158beb1-bb0c-4018-b87f-889a7f7bfc30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4026ceab20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.673 104706 INFO neutron.agent.ovn.metadata.agent [-] Port 8158beb1-bb0c-4018-b87f-889a7f7bfc30 in datapath 734c234c-1e07-4c56-b2d0-6f08a47eb16a unbound from our chassis#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.675 104706 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 734c234c-1e07-4c56-b2d0-6f08a47eb16a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.677 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a5869a5c-b8d3-4bea-b9f2-5ae449dc44ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.678 104706 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a namespace which is not needed anymore#033[00m
Jan 30 04:56:35 np0005601977 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 30 04:56:35 np0005601977 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000038.scope: Consumed 13.710s CPU time.
Jan 30 04:56:35 np0005601977 systemd-machined[154431]: Machine qemu-45-instance-00000038 terminated.
Jan 30 04:56:35 np0005601977 podman[231447]: 2026-01-30 09:56:35.763436329 +0000 UTC m=+0.069589400 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:56:35 np0005601977 podman[231446]: 2026-01-30 09:56:35.795063033 +0000 UTC m=+0.103284553 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:56:35 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [NOTICE]   (231099) : haproxy version is 2.8.14-c23fe91
Jan 30 04:56:35 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [NOTICE]   (231099) : path to executable is /usr/sbin/haproxy
Jan 30 04:56:35 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [WARNING]  (231099) : Exiting Master process...
Jan 30 04:56:35 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [ALERT]    (231099) : Current worker (231101) exited with code 143 (Terminated)
Jan 30 04:56:35 np0005601977 neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a[231095]: [WARNING]  (231099) : All workers exited. Exiting... (0)
Jan 30 04:56:35 np0005601977 systemd[1]: libpod-684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa.scope: Deactivated successfully.
Jan 30 04:56:35 np0005601977 conmon[231095]: conmon 684a7b0327134cfd3c0c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa.scope/container/memory.events
Jan 30 04:56:35 np0005601977 podman[231509]: 2026-01-30 09:56:35.841079429 +0000 UTC m=+0.054713765 container died 684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.841 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.845 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa-userdata-shm.mount: Deactivated successfully.
Jan 30 04:56:35 np0005601977 systemd[1]: var-lib-containers-storage-overlay-b51a5db252030745d67f4cc7aa85671eec2009da545d6c059227ecdc98fb9c84-merged.mount: Deactivated successfully.
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.882 183134 INFO nova.virt.libvirt.driver [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Instance destroyed successfully.#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.883 183134 DEBUG nova.objects.instance [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'resources' on Instance uuid 69cef024-06dd-442f-b13e-b1b446e6d2a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:56:35 np0005601977 podman[231509]: 2026-01-30 09:56:35.88834508 +0000 UTC m=+0.101979386 container cleanup 684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:56:35 np0005601977 systemd[1]: libpod-conmon-684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa.scope: Deactivated successfully.
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.909 183134 DEBUG nova.virt.libvirt.vif [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-30T09:55:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-292157088',display_name='tempest-TestGettingAddress-server-292157088',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-292157088',id=56,image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBINmXbTK7RjlQWBIWVNlehNtZnYEj57SLVKa2MAZNGWc2mcP5LaL/F98DiZ9YsaDDMSAQJPNQdTMZxaIBbPw00fR7PHLDID9O61doK2J20TKkHdwCrWMRj/YDbFYIk8f6Q==',key_name='tempest-TestGettingAddress-1467555575',keypairs=<?>,launch_index=0,launched_at=2026-01-30T09:55:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='69532d75aefe4fa6ada76bf1c1d1da9b',ramdisk_id='',reservation_id='r-7adzda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ab7cf61b-98df-4a10-83fd-7d23191f2bba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1926219776',owner_user_name='tempest-TestGettingAddress-1926219776-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-30T09:55:43Z,user_data=None,user_id='4f469d29ddd6455299c7fb0220c1ffcc',uuid=69cef024-06dd-442f-b13e-b1b446e6d2a7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.909 183134 DEBUG nova.network.os_vif_util [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converting VIF {"id": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "address": "fa:16:3e:e6:90:c9", "network": {"id": "734c234c-1e07-4c56-b2d0-6f08a47eb16a", "bridge": "br-int", "label": "tempest-network-smoke--621123478", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fee6:90c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "69532d75aefe4fa6ada76bf1c1d1da9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8158beb1-bb", "ovs_interfaceid": "8158beb1-bb0c-4018-b87f-889a7f7bfc30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.910 183134 DEBUG nova.network.os_vif_util [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.911 183134 DEBUG os_vif [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.913 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.913 183134 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8158beb1-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.914 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.916 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.918 183134 INFO os_vif [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:90:c9,bridge_name='br-int',has_traffic_filtering=True,id=8158beb1-bb0c-4018-b87f-889a7f7bfc30,network=Network(734c234c-1e07-4c56-b2d0-6f08a47eb16a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8158beb1-bb')#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.919 183134 INFO nova.virt.libvirt.driver [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Deleting instance files /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7_del#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.919 183134 INFO nova.virt.libvirt.driver [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Deletion of /var/lib/nova/instances/69cef024-06dd-442f-b13e-b1b446e6d2a7_del complete#033[00m
Jan 30 04:56:35 np0005601977 podman[231557]: 2026-01-30 09:56:35.963677173 +0000 UTC m=+0.056744963 container remove 684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.968 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed748c1-a79f-4094-9c6c-5f2cf99c436c]: (4, ('Fri Jan 30 09:56:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a (684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa)\n684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa\nFri Jan 30 09:56:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a (684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa)\n684a7b0327134cfd3c0ce80ecd4618db791c0c66cb6e78b5332a2e2eecbb09fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.973 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[bafcf87e-f6c8-41a0-bec0-79651c9eabd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.975 104706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap734c234c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.978 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 kernel: tap734c234c-10: left promiscuous mode
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.985 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.988 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:35 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:35.994 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9830a5-7641-4732-8fd9-82738546d667]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:35 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.999 183134 DEBUG nova.compute.manager [req-c029a082-8591-4969-b89f-2a255d4be57c req-b32f3722-1726-43f1-9d2d-2da3e59db0f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-vif-unplugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:35.999 183134 DEBUG oslo_concurrency.lockutils [req-c029a082-8591-4969-b89f-2a255d4be57c req-b32f3722-1726-43f1-9d2d-2da3e59db0f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.001 183134 DEBUG oslo_concurrency.lockutils [req-c029a082-8591-4969-b89f-2a255d4be57c req-b32f3722-1726-43f1-9d2d-2da3e59db0f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.001 183134 DEBUG oslo_concurrency.lockutils [req-c029a082-8591-4969-b89f-2a255d4be57c req-b32f3722-1726-43f1-9d2d-2da3e59db0f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.003 183134 DEBUG nova.compute.manager [req-c029a082-8591-4969-b89f-2a255d4be57c req-b32f3722-1726-43f1-9d2d-2da3e59db0f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] No waiting events found dispatching network-vif-unplugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.004 183134 DEBUG nova.compute.manager [req-c029a082-8591-4969-b89f-2a255d4be57c req-b32f3722-1726-43f1-9d2d-2da3e59db0f6 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-vif-unplugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 30 04:56:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:36.018 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[136ac307-969b-49d1-8940-e136d67eaed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:36.021 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[02d175d7-96cd-4920-9c34-cb54ae423cbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.021 183134 INFO nova.compute.manager [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.022 183134 DEBUG oslo.service.loopingcall [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.023 183134 DEBUG nova.compute.manager [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.023 183134 DEBUG nova.network.neutron [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:56:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:36.040 211716 DEBUG oslo.privsep.daemon [-] privsep: reply[a79d412f-c0d6-4737-ad34-b1123425ab70]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547353, 'reachable_time': 23758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231572, 'error': None, 'target': 'ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:36 np0005601977 systemd[1]: run-netns-ovnmeta\x2d734c234c\x2d1e07\x2d4c56\x2db2d0\x2d6f08a47eb16a.mount: Deactivated successfully.
Jan 30 04:56:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:36.045 105085 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-734c234c-1e07-4c56-b2d0-6f08a47eb16a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 30 04:56:36 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:36.046 105085 DEBUG oslo.privsep.daemon [-] privsep: reply[304bab12-81c0-4dcb-a648-da7e767ab1d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.448 183134 DEBUG nova.compute.manager [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-changed-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.448 183134 DEBUG nova.compute.manager [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Refreshing instance network info cache due to event network-changed-8158beb1-bb0c-4018-b87f-889a7f7bfc30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.449 183134 DEBUG oslo_concurrency.lockutils [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.449 183134 DEBUG oslo_concurrency.lockutils [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.449 183134 DEBUG nova.network.neutron [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Refreshing network info cache for port 8158beb1-bb0c-4018-b87f-889a7f7bfc30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.544 183134 DEBUG nova.network.neutron [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.564 183134 INFO nova.compute.manager [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Took 0.54 seconds to deallocate network for instance.#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.610 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.610 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.649 183134 DEBUG nova.compute.provider_tree [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.661 183134 DEBUG nova.scheduler.client.report [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.679 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.699 183134 INFO nova.scheduler.client.report [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 69cef024-06dd-442f-b13e-b1b446e6d2a7#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.710 183134 INFO nova.network.neutron [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Port 8158beb1-bb0c-4018-b87f-889a7f7bfc30 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.710 183134 DEBUG nova.network.neutron [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.743 183134 DEBUG oslo_concurrency.lockutils [req-d3f27cac-dc20-468e-b5f3-26d097822dfd req-00f7f41a-ba80-4b67-b78b-f78f0aec4f0d dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-69cef024-06dd-442f-b13e-b1b446e6d2a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:56:36 np0005601977 nova_compute[183130]: 2026-01-30 09:56:36.781 183134 DEBUG oslo_concurrency.lockutils [None req-20d8a91e-a8de-4d00-b047-e08fdabb57b4 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.130 183134 DEBUG nova.compute.manager [req-4e2a089b-a23a-4821-bf91-9ba2b80c0d8a req-2d12777f-3ea2-43de-8054-5e184f44e217 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.130 183134 DEBUG oslo_concurrency.lockutils [req-4e2a089b-a23a-4821-bf91-9ba2b80c0d8a req-2d12777f-3ea2-43de-8054-5e184f44e217 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.131 183134 DEBUG oslo_concurrency.lockutils [req-4e2a089b-a23a-4821-bf91-9ba2b80c0d8a req-2d12777f-3ea2-43de-8054-5e184f44e217 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.131 183134 DEBUG oslo_concurrency.lockutils [req-4e2a089b-a23a-4821-bf91-9ba2b80c0d8a req-2d12777f-3ea2-43de-8054-5e184f44e217 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Lock "69cef024-06dd-442f-b13e-b1b446e6d2a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.131 183134 DEBUG nova.compute.manager [req-4e2a089b-a23a-4821-bf91-9ba2b80c0d8a req-2d12777f-3ea2-43de-8054-5e184f44e217 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] No waiting events found dispatching network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.132 183134 WARNING nova.compute.manager [req-4e2a089b-a23a-4821-bf91-9ba2b80c0d8a req-2d12777f-3ea2-43de-8054-5e184f44e217 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received unexpected event network-vif-plugged-8158beb1-bb0c-4018-b87f-889a7f7bfc30 for instance with vm_state deleted and task_state None.#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.535 183134 DEBUG nova.compute.manager [req-20b0cf86-20da-4e82-b51e-4d5c3f21ecc9 req-7b3054ad-efbd-4781-97bc-97af43ca9281 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Received event network-vif-deleted-8158beb1-bb0c-4018-b87f-889a7f7bfc30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:56:38 np0005601977 nova_compute[183130]: 2026-01-30 09:56:38.805 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:39 np0005601977 nova_compute[183130]: 2026-01-30 09:56:39.927 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:39 np0005601977 nova_compute[183130]: 2026-01-30 09:56:39.965 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:40 np0005601977 nova_compute[183130]: 2026-01-30 09:56:40.586 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:40 np0005601977 nova_compute[183130]: 2026-01-30 09:56:40.915 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:42 np0005601977 podman[231574]: 2026-01-30 09:56:42.887755404 +0000 UTC m=+0.102782159 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:56:43 np0005601977 nova_compute[183130]: 2026-01-30 09:56:43.807 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:45 np0005601977 nova_compute[183130]: 2026-01-30 09:56:45.918 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:47 np0005601977 nova_compute[183130]: 2026-01-30 09:56:47.602 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766992.600204, 36e7106c-ee0f-41ee-a9b1-c2f98104603b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:56:47 np0005601977 nova_compute[183130]: 2026-01-30 09:56:47.602 183134 INFO nova.compute.manager [-] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:56:47 np0005601977 nova_compute[183130]: 2026-01-30 09:56:47.622 183134 DEBUG nova.compute.manager [None req-f04b3f9e-7a60-47f0-9b33-d310e3a31ca1 - - - - - -] [instance: 36e7106c-ee0f-41ee-a9b1-c2f98104603b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:56:48 np0005601977 nova_compute[183130]: 2026-01-30 09:56:48.809 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:50 np0005601977 nova_compute[183130]: 2026-01-30 09:56:50.879 183134 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769766995.8762615, 69cef024-06dd-442f-b13e-b1b446e6d2a7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 30 04:56:50 np0005601977 nova_compute[183130]: 2026-01-30 09:56:50.880 183134 INFO nova.compute.manager [-] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] VM Stopped (Lifecycle Event)#033[00m
Jan 30 04:56:50 np0005601977 nova_compute[183130]: 2026-01-30 09:56:50.903 183134 DEBUG nova.compute.manager [None req-62631622-c884-4603-b4ad-18f44a4b9460 - - - - - -] [instance: 69cef024-06dd-442f-b13e-b1b446e6d2a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 30 04:56:50 np0005601977 nova_compute[183130]: 2026-01-30 09:56:50.921 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:51 np0005601977 podman[231601]: 2026-01-30 09:56:51.85720694 +0000 UTC m=+0.067939973 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.343 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.343 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.344 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.344 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.344 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.344 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.379 183134 DEBUG nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.379 183134 WARNING nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.380 183134 WARNING nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.380 183134 WARNING nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.380 183134 INFO nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Removable base files: /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4 /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.381 183134 INFO nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.381 183134 INFO nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f3bdd19f58c6dd32802b100d2363d205d4b05be4#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.381 183134 INFO nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/e5be927f4d5d3cf8a551fcd7e66a81d6274021ec#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.381 183134 DEBUG nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.382 183134 DEBUG nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.382 183134 DEBUG nova.virt.libvirt.imagecache [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 30 04:56:53 np0005601977 nova_compute[183130]: 2026-01-30 09:56:53.866 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:55 np0005601977 nova_compute[183130]: 2026-01-30 09:56:55.923 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:56:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:57.412 104706 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:57.412 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:57 np0005601977 ovn_metadata_agent[104701]: 2026-01-30 09:56:57.412 104706 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:58 np0005601977 nova_compute[183130]: 2026-01-30 09:56:58.869 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:00 np0005601977 podman[231625]: 2026-01-30 09:57:00.856474079 +0000 UTC m=+0.066247935 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.7, io.buildah.version=1.33.7, config_id=openstack_network_exporter, distribution-scope=public, release=1769056855, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:57:00 np0005601977 podman[231626]: 2026-01-30 09:57:00.867194605 +0000 UTC m=+0.071220487 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Jan 30 04:57:00 np0005601977 nova_compute[183130]: 2026-01-30 09:57:00.951 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:03 np0005601977 nova_compute[183130]: 2026-01-30 09:57:03.870 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:05 np0005601977 nova_compute[183130]: 2026-01-30 09:57:05.953 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:06 np0005601977 podman[231663]: 2026-01-30 09:57:06.835820964 +0000 UTC m=+0.048613660 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:57:06 np0005601977 podman[231662]: 2026-01-30 09:57:06.8398606 +0000 UTC m=+0.056404034 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 30 04:57:08 np0005601977 nova_compute[183130]: 2026-01-30 09:57:08.871 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:10 np0005601977 ovn_controller[95460]: 2026-01-30T09:57:10Z|00568|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 30 04:57:10 np0005601977 nova_compute[183130]: 2026-01-30 09:57:10.954 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:13 np0005601977 nova_compute[183130]: 2026-01-30 09:57:13.871 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:13 np0005601977 podman[231703]: 2026-01-30 09:57:13.878819123 +0000 UTC m=+0.092859115 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:57:15 np0005601977 nova_compute[183130]: 2026-01-30 09:57:15.955 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:18 np0005601977 nova_compute[183130]: 2026-01-30 09:57:18.872 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:20 np0005601977 nova_compute[183130]: 2026-01-30 09:57:20.958 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:22 np0005601977 podman[231729]: 2026-01-30 09:57:22.82942271 +0000 UTC m=+0.050661539 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.383 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.417 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.417 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.418 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.418 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.607 183134 WARNING nova.virt.libvirt.driver [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.609 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5719MB free_disk=73.24177551269531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.609 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.609 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.672 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.673 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.693 183134 DEBUG nova.compute.provider_tree [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed in ProviderTree for provider: eb11f67d-14b4-46ee-89fd-92936c45ed58 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.708 183134 DEBUG nova.scheduler.client.report [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Inventory has not changed for provider eb11f67d-14b4-46ee-89fd-92936c45ed58 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.731 183134 DEBUG nova.compute.resource_tracker [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.732 183134 DEBUG oslo_concurrency.lockutils [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:23 np0005601977 nova_compute[183130]: 2026-01-30 09:57:23.874 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:25 np0005601977 nova_compute[183130]: 2026-01-30 09:57:25.958 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:28 np0005601977 nova_compute[183130]: 2026-01-30 09:57:28.693 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:28 np0005601977 nova_compute[183130]: 2026-01-30 09:57:28.694 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:28 np0005601977 nova_compute[183130]: 2026-01-30 09:57:28.877 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:29 np0005601977 nova_compute[183130]: 2026-01-30 09:57:29.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:30 np0005601977 nova_compute[183130]: 2026-01-30 09:57:30.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:30 np0005601977 nova_compute[183130]: 2026-01-30 09:57:30.960 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:31 np0005601977 nova_compute[183130]: 2026-01-30 09:57:31.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:31 np0005601977 podman[231755]: 2026-01-30 09:57:31.844050796 +0000 UTC m=+0.061595102 container health_status b110e712af4e364deeff10b6e5870cda63df255389b0450b2366a61e4b05b941 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1769056855, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 30 04:57:31 np0005601977 podman[231756]: 2026-01-30 09:57:31.851594202 +0000 UTC m=+0.062682463 container health_status b47f7c46186caca2dcda1489f4ccdd5d81d72c2706dfc72aaa1a5ba0de280d91 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:57:33 np0005601977 nova_compute[183130]: 2026-01-30 09:57:33.881 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.342 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.343 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.391 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.392 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.392 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.393 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:34 np0005601977 nova_compute[183130]: 2026-01-30 09:57:34.393 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:57:35 np0005601977 nova_compute[183130]: 2026-01-30 09:57:35.962 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:36 np0005601977 nova_compute[183130]: 2026-01-30 09:57:36.408 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:37 np0005601977 podman[231795]: 2026-01-30 09:57:37.871644102 +0000 UTC m=+0.085150315 container health_status 9cfd696a39ae4568bca4e65cf66aca47b2b03240e24e05b495855b9f5dc16e4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:57:37 np0005601977 podman[231796]: 2026-01-30 09:57:37.88627183 +0000 UTC m=+0.097118577 container health_status c03b488b770b88b1f6d1ae2c89971085158b73ca897a276030f979aee63b1e32 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:57:38 np0005601977 nova_compute[183130]: 2026-01-30 09:57:38.881 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:40 np0005601977 nova_compute[183130]: 2026-01-30 09:57:40.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:40 np0005601977 nova_compute[183130]: 2026-01-30 09:57:40.344 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:40 np0005601977 nova_compute[183130]: 2026-01-30 09:57:40.964 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:43 np0005601977 nova_compute[183130]: 2026-01-30 09:57:43.917 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:44 np0005601977 systemd-logind[809]: New session 29 of user zuul.
Jan 30 04:57:44 np0005601977 systemd[1]: Started Session 29 of User zuul.
Jan 30 04:57:44 np0005601977 nova_compute[183130]: 2026-01-30 09:57:44.382 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:44 np0005601977 podman[231838]: 2026-01-30 09:57:44.42061959 +0000 UTC m=+0.111338233 container health_status 92bfccb492fa83254b973775e6bc7e8ab61e63a59cf255557d47e6a9edb70851 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:57:45 np0005601977 nova_compute[183130]: 2026-01-30 09:57:45.967 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:48 np0005601977 nova_compute[183130]: 2026-01-30 09:57:48.343 183134 DEBUG oslo_service.periodic_task [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:48 np0005601977 nova_compute[183130]: 2026-01-30 09:57:48.344 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:57:48 np0005601977 nova_compute[183130]: 2026-01-30 09:57:48.361 183134 DEBUG nova.compute.manager [None req-d3f97742-d89a-456d-a77a-985a900cff76 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:57:48 np0005601977 nova_compute[183130]: 2026-01-30 09:57:48.918 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:50 np0005601977 ovs-vsctl[232075]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 30 04:57:50 np0005601977 nova_compute[183130]: 2026-01-30 09:57:50.970 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:51 np0005601977 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 231890 (sos)
Jan 30 04:57:51 np0005601977 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 30 04:57:51 np0005601977 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 30 04:57:51 np0005601977 virtqemud[182587]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 30 04:57:51 np0005601977 virtqemud[182587]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 30 04:57:51 np0005601977 virtqemud[182587]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 30 04:57:53 np0005601977 podman[232565]: 2026-01-30 09:57:53.900893597 +0000 UTC m=+0.107387720 container health_status 2350402c887beb3095bcdd56cca1cfa0a2cb6a36674ee161ea0dcc9bd504fb13 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1e558703ba13d080d8dd2db4aae028647023a2082a95a222c9abf744b27bc4da-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:57:53 np0005601977 nova_compute[183130]: 2026-01-30 09:57:53.920 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 30 04:57:54 np0005601977 systemd[1]: Starting Hostname Service...
Jan 30 04:57:54 np0005601977 systemd[1]: Started Hostname Service.
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.455 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.456 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.457 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 ceilometer_agent_compute[192799]: 2026-01-30 09:57:55.458 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:57:55 np0005601977 nova_compute[183130]: 2026-01-30 09:57:55.972 183134 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
